What's next after locating exoplanets? Maybe imaging the surface of sterile planets and moons.
Posted 21 August 2019 - 02:45 AM
Why not exoplanets themselves? Well, that would be worthwhile, if they didn't have atmospheres that change very quickly. You see, in order to obtain a high-resolution image of a planet, you have to take long-exposure shots -- and, unfortunately, if a planet's atmosphere changes rapidly, you won't be able to get those nice shots. An Earth-like planet light-years away will come out looking like a smooth, almost uniform, light-blue sphere; we wouldn't be able to get a snapshot with nice, crisp clouds. But we might be able to image its moon and see the craters -- or even a moon base, if one exists!
In fact, using a large number of long exposures, a moon without an atmosphere -- one that doesn't change much over eons -- might be something we could image reasonably sharply. We'd need a pretty good idea of its location and of how it is oriented. Then, using enough long exposures, it should be possible to "average away" much of the "noise" in the image. There will be light pollution from the parent star to overcome, but I think enough image signal should stand above the starlight to work with.
You might not even need to know how the moon is oriented, in fact. There might be a way to use some optimization to do this -- I'm not sure. Say, apply certain "low-complexity transformations" (e.g. rotations, translations, or simple warps) to the images, so as to maximize the correlations between them. This would have the effect of artificially "rotating" the moon to bring all the images into alignment with each other.
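This alignment idea can be sketched numerically. Below is a minimal, hypothetical example (plain NumPy, integer shifts only; real frames would also need sub-pixel warps and rotations) of registering two frames by finding the shift that maximizes their cross-correlation:

```python
import numpy as np

def best_shift(ref, img):
    """Estimate the integer (dy, dx) displacement of img relative to ref
    by locating the peak of their circular cross-correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    ny, nx = ref.shape
    if dy > ny // 2:                     # map large shifts to negative offsets
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return int(dy), int(dx)

# Toy check: a frame circularly shifted by (3, -2) relative to the reference
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -2), axis=(0, 1))
print(best_shift(ref, img))              # (3, -2)
```

The peak of the FFT-based cross-correlation recovers the relative displacement; in the exomoon setting the same objective would be optimized over a family of rotations and warps instead of pure translations.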
I'm not saying this is a sure thing; it just seems like it might be possible. It may turn out that you need, basically, millions of long-exposure shots to get a sharp image in the end. If the idea breaks, that's probably where it will do so: you'll simply run out of time before you have enough shots to average away the noise and subtract the parent star's light. On the other hand, if you can do it with just 1,000 long-exposure shots... well, that's doable; it will just take about 3 years to collect them.
Addendum: Thinking about it further, it might even be possible to image the surface of planets with atmosphere, provided that:
1. It's possible to see the ground when not obscured by clouds.
2. The features on the ground don't move around very much. Certainly large features like mountains or roads don't change much over time.
3. The cloud-dynamics are such that you can average them away if you have enough exposures.
But, certainly, it would be a lot easier to image sterile moons and planets, without atmospheres.
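The "average them away" intuition in point 3 is easy to simulate. A toy sketch (optimistically assuming the frames are already aligned and the "cloud" noise is independent between exposures -- under those assumptions the error shrinks roughly as 1/sqrt(N)):

```python
import numpy as np

rng = np.random.default_rng(42)
surface = np.random.default_rng(7).random((32, 32))   # static ground-truth "surface"

def stack(n_frames, cloud_sigma=2.0):
    """Average n_frames exposures of the same surface, each corrupted
    by independent random "cloud" noise."""
    acc = np.zeros_like(surface)
    for _ in range(n_frames):
        acc += surface + rng.normal(0, cloud_sigma, surface.shape)
    return acc / n_frames

# Mean absolute error of the stacked image vs. the true surface
errs = {n: np.abs(stack(n) - surface).mean() for n in (1, 100, 10000)}
print(errs)
```

With 100x more frames the residual error drops by about a factor of 10 -- the square-root law that makes "millions of exposures" the worry rather than any hard barrier.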
Posted 21 August 2019 - 05:36 AM
We should build huge space telescopes, and turn Moon craters into telescopes.
Posted 21 August 2019 - 08:43 PM
Finding exoplanets is great fun... but it's getting a little old. So, what's next? Perhaps imaging the surfaces of large moons.
You must be kidding. Exoplanets are planets of other stars. They are far, far away. With current technology, there is absolutely no way to make such pictures. The best we can get are dim one-pixel dots.
As for the "next step", that will be atmospheric spectra of transiting planets. And this is already on the verge of possible. These spectra will give scientists important information about exoplanet atmospheres. In some cases, they'll even serve as indirect evidence of alien life (if, for example, we discover an oxygen-rich atmosphere on an Earth-size planet in the habitable zone).
Posted 21 August 2019 - 10:53 PM
Read the answers.
Here's another one that's kind of more in the spirit of my post (using multiple exposures to average away noise and boost resolution; carefully aligning the images with warps and rotations; using correlations and autocorrelations; etc.):
I'm afraid it would be extremely difficult -- simply put, the number of photons reflected off a planet's surface and reaching Earth (and the telescope lens, however big) within the timeframe needed for a solid photo is too small to create any meaningful image.
Planets are not stationary; they orbit their stars, and that means a long-exposure photo will show them as trails. Of course, the telescope could be made to follow the orbital movement, and we could eventually achieve an image of the planet's disc. Unfortunately, though, planets also rotate about their axes, which means we don't get surface photos, just blurred lines around the disc. Now, if we were smart enough, we could take short-exposure photos multiple times at the same "hour" of the planet's "day", and by combining them we might get what we want -- provided we somehow work out how long the planet's "day" is. But that only works for planets with no atmosphere, or a thin one. If the planet has weather, that's the end; it's not repeatable at all.
So there -- we already have two techniques for taking decent photos of exoplanets. The first -- send a probe there, have it take photos, and return -- would take thousands of years to complete. The other -- construct a telescope with a lens enormous enough to capture sufficient photons reflected off a given planet within a time frame that won't blur the surface beyond recognition -- would cost hundreds of trillions of dollars. The James Webb Space Telescope (the world's largest space telescope) costs almost 20 billion dollars and will not be able to resolve exoplanets.
EDIT: This actually could be done on a somewhat more reasonable budget. You'd need a high-precision telescope (not necessarily one with an enormous, bright lens), with a sensor capable of registering separate photons rather than their sum over time -- "recording a movie" instead of acquiring a still. The telescope would still need to follow the planet's orbit, but by registering observations over a long time and using the autocorrelation function of the measurements, it could determine the rotation period (day length) of the planet: specific terrain features would reappear at specific places at regular intervals (one day apart), creating a cyclic function within the general noise. Knowing the "day length" and the precise time of each photon, you could remap all your measured points onto their correct locations on the rotating sphere over time, and in that way recreate an image of the whole surface -- similar to how a modern camera uses its movement path, recorded by accelerometers, to recreate a static image from a long-exposure shot taken by a shaky hand.
Of course this still requires telescopes better than anything we have, but it's well within reach of contemporary technology, and not at an excessive budget.
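The autocorrelation step in that EDIT can be sketched with a 1-D toy light curve (made-up numbers; a real signal would be photon-limited and far noisier):

```python
import numpy as np

rng = np.random.default_rng(1)
true_period = 37                          # samples per "day" (made-up value)
t = np.arange(5000)
# Surface features reappear once per rotation; add measurement noise
flux = np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.3, t.size)

# Autocorrelation of the mean-subtracted light curve
x = flux - flux.mean()
ac = np.correlate(x, x, mode="full")[x.size - 1:]

# First peak after lag 0 estimates the rotation period
# (search window assumes the period is between 10 and 60 samples)
lag = int(np.argmax(ac[10:60])) + 10
print("estimated period:", lag)
```

The periodic component produces a peak in the autocorrelation at the rotation period, standing above the noise floor -- the "cyclic function in the general noise" described above.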
I agree with that. I don't see why it can't work. Maybe somebody who knows better can explain it to me.
And about the Rayleigh Limit, see this article:
The magnifying power of light microscopy exerts a universal fascination that lasts a lifetime. Soon enough, however, users of optical microscopes learn (or should learn) that this power comes with limitations due to diffraction, as explained by Ernst Abbe (1) more than a century ago: any object, no matter how small, will be imaged by a conventional optical system as a finite-sized spot, with a minimum dimension obtained for point-like objects (such as single molecules) approximately equal to the wavelength of light, λ, multiplied by the optical magnification, M, and divided by the numerical aperture (N.A.) (Fig. 1 A). The radius of this so-called point-spread function (PSF) can be used as a convenient criterion to define an upper limit to the minimum distance below which two nearby objects in the object plane cannot be distinguished (Fig. 1 B). It has been known for some time that this Rayleigh criterion (2) is a bit too conservative, and that objects significantly closer can still be resolved with careful image analysis or clever illumination and detection schemes. In an article published in a recent issue of PNAS, Ram et al. (3) revisited this question and demonstrated that there is really no limit to how close two identical point-like objects can be and still have distances measurable with almost arbitrary precision by using conventional microscopy.
If you are just taking a single, short-exposure snapshot, then the Rayleigh Limit (due to the diffraction limit) is going to limit your ability to resolve features. But if you are allowed to make a long movie of stationary objects, you should be able to boost the resolution beyond that limit -- and presumably that is what this PNAS article sets out to do.
I am not an expert in optics or in telescopes, but my mathematical intuition tells me all of this should be possible somehow -- if you have enough exposures or a long enough movie.
Posted 22 August 2019 - 11:05 AM
I am not an expert in optics or in telescopes, but my mathematical intuition tells me all of this should be possible somehow -- if you have enough exposures or a long enough movie.
Neither am I. I just know that, with current technology, astronomers cannot even determine the diameters of trans-Neptunian objects, which are observed as dots or tiny blurred spots. And you're talking about the surfaces of exoplanets! This may be theoretically possible, but in practice it will take something enormous (say, space telescopes with kilometer-sized lenses). Alas, I will not see them during my lifetime.
Image (one of the best!) of Pluto and Charon (Solar System, only 40 a.u. from us) made by Hubble Space Telescope:
Posted 22 August 2019 - 11:42 AM
The Hubble isn't made for imaging planets like that. Its aperture is only 2.4 meters, according to Wikipedia:
That's sufficient if you want to image objects the size of a galaxy from far away; but not good enough for Pluto.
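The rough numbers bear this out. Using the Rayleigh criterion, theta ~ 1.22 * lambda / D, Pluto at 40 AU spans only about one diffraction-limited resolution element for a 2.4 m aperture (back-of-envelope values assumed for the wavelength and Pluto's diameter):

```python
wavelength = 550e-9                  # visible light, meters (assumed)
aperture = 2.4                       # Hubble primary mirror, meters
theta = 1.22 * wavelength / aperture           # Rayleigh criterion, radians

AU = 1.496e11                        # meters
pluto_diameter = 2.377e6             # meters
pluto_angle = pluto_diameter / (40 * AU)       # small-angle approximation

print(f"diffraction limit : {theta:.2e} rad")
print(f"Pluto angular size: {pluto_angle:.2e} rad")
print(f"resolution elements across Pluto: {pluto_angle / theta:.1f}")
```

That works out to roughly 1.4 resolution elements across the whole disc -- hence the few-pixel blobs in the Hubble image above.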
I'm actually not sure why people haven't made the effort to take better ground-based images of Pluto, using adaptive optics and large-aperture telescopes, or by combining the information from multiple shots at different locations. Maybe they don't see it as worth the effort, and prefer to spend their time imaging more distant -- and larger -- objects.
Similarly, exoplanet hunting is something that could have been done many decades ago. It doesn't require super-advanced technology. Nobody bothered to try, it seems -- it wasn't a mystery, and it didn't require a leap of genius. Somebody just had to do the work.
Take a look at this image:
It's an image of a black hole 55 million light-years from Earth. The width of the black hole is a small multiple of the size of our solar system: our solar system would fit inside it, but not ten copies of it.
If you can do that at 55 million light-years, you should be able to image objects about 1/(5.5 million) that size at a distance of 10 light-years. And the smallest resolvable feature will probably be a factor of 10 smaller still.
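A back-of-envelope check of that scaling (with the big assumption that the same angular resolution could be achieved for reflected light -- the black-hole image was made at radio wavelengths by an Earth-sized interferometer):

```python
d_blackhole_ly = 55e6          # distance to the imaged black hole, light-years
d_exoplanet_ly = 10            # a nearby exoplanet system, light-years
ratio = d_blackhole_ly / d_exoplanet_ly        # 5.5 million

solar_system_m = 9.0e12        # ~ diameter of Neptune's orbit, meters (rough)
resolvable_m = solar_system_m / ratio
print(f"resolvable feature: ~{resolvable_m / 1000:.0f} km")
```

In other words, features on the order of a thousand kilometers -- continent-sized -- before the extra factor of 10 is even considered.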
- Yuli Ban likes this
Posted 22 August 2019 - 02:48 PM
TL;DR: Surely we can expect our space-telescope game to improve a lot over the next 10-20 years, which might bring effective imaging of exoplanets within reach? Even if it's not currently possible.
Digital camera tech has improved massively in the last decade in cost, performance, and size/weight, all of which are good signs for any space telescopes we might be launching in 10 years.
With the cost to orbit coming down, and the success of the James Webb telescope (we hope!), we might see more space telescopes launched in the future. Visible-light telescopes tend to generate more interest and enthusiasm than X-ray/gamma-ray telescopes.
Also, wouldn't an array of multiple space telescopes potentially have a stupidly high resolution? Especially if we stick them in solar orbits?
Also, starspawn0, have you seen that slow-mo video of light moving through a bottle at (effectively) a trillion frames per second?
- starspawn0 likes this
Posted 22 August 2019 - 03:46 PM
I think there's a lot more one can do just using lots of exposures. I could be missing something really obvious, but this is my intuition (and maybe this is the idea behind the PNAS paper above):
Let's say you want to image a grid of "emitters" from a great distance. Say the grid is 1,000 x 1,000 point light sources; your camera is 1,000 x 1,000 pixels; and you know the distance from the emitters to your camera. Now, when the light travels from the emitters to the camera, it gets smeared around -- fed through an appropriate "blurring function". You should also know how to compute this blurring function, so we assume it's known, too, and invertible.
So, we have 1,000*1,000 = 1 million degrees of freedom among all the points in the image grid; and we have 1,000*1,000 = 1 million degrees of freedom of pixel values. The former we don't know; the latter we do; and, so by inverting the blurring function, we can recover the point light sources.
What's the problem?
The problem is that you don't know those pixel values exactly (also, the actual blurring function may have discontinuities and not be perfectly invertible -- but that's probably surmountable in practice). There is a little bit of noise in each and every one of them. So all you can really do is invert a multi-dimensional sphere (or maybe ellipsoid) of pixel values when you invert the blurring function, where the radius of the sphere in each direction corresponds to the uncertainty in your estimate of the true light intensity at each pixel; and this may lead to all sorts of candidate grid patterns.
Using priors about what kinds of grid patterns are allowed (e.g. ones that look like solar systems), you can reduce the number of possibilities greatly; but you can't reduce it enough to get surface features of exoplanets.
Are there any other options?
How about reducing the noise? And how would we do that? Well, if we can increase the contrast at each pixel, then we can do it. And one way to increase the contrast is to use longer exposures and/or multiple exposures.
A skeptic will say, "But... the Rayleigh criterion!" Well, yes, we've all heard of the infamous Rayleigh criterion -- the limit on using optics to resolve features, due to diffraction. However, what I'm describing here is a different beast: using mathematics -- instead of lenses -- to deblur. I think the limitations really have more to do with the amount of information (# of pixels) and the contrast than with the diffraction limit -- but I could be wrong.
Again, it's possible I'm missing something really obvious; but that PNAS paper above makes me a little more confident in my intuition.
- Alislaws likes this
Posted 23 August 2019 - 09:55 PM
Surface Imaging of Proxima b and Other Exoplanets: Topography, Biosignatures, and Artificial Mega-Structures
Seeing oceans, continents, quasi-static weather, and other surface features on exoplanets may allow us to detect and characterize life outside the solar system. The Proxima b planet resides within the stellar habitable zone allowing for liquid water on its surface, and it may be Earth-like. However, even the largest planned telescopes will not be able to resolve its surface features directly. Here, we demonstrate an inversion technique to image indirectly exoplanet surfaces using observed unresolved reflected light variations over the course of the exoplanet's orbital and axial rotation: ExoPlanet Surface Imaging (EPSI). We show that the reflected light curve contains enough information to detect both longitudinal and latitudinal structures and to map exoplanet surface features. We demonstrate this using examples of Solar system planets and moons as well as simulated planets with Earth-like life and artificial megastructures. We also describe how it is possible to infer the planet and orbit geometry from light curves. In particular, we show how albedo maps of Proxima b can be successfully reconstructed for tidally locked, resonance, and unlocked axial and orbital rotation. Such albedo maps obtained in different wavelength passbands can provide "photographic" views of distant exoplanets. We estimate the signal-to-noise ratio necessary for successful inversions and analyse telescope and detector requirements necessary for the first surface images of Proxima b and other nearby exoplanets.
The technique seems to be sort of like what I described -- using lots and lots of exposures, and combining their information along with rotations. I could be wrong, though -- I will need to read it more deeply than just the light skim I've done so far.
In Section 3, we present simulations and inversions for exoplanets with Earth-like albedo distribution due to ocean and land topography. We show that, under some circumstances, there is sufficient information in the reflected light curve to resolve an accurate map of the exoplanet surface on the scale of subcontinents. We also discuss limitations due to data noise, weather system evolution, and seasonal variations in the cloud and surface albedo. In Section 4, we present inversions for Solar system planets and moons as analogs of exoplanets. We can recover global circulation cloud patterns on Jupiter, Neptune, and Venus, as well as surface features on the Moon, Mars, Io and Pluto.
Hence, after only one season of observations we will be able to obtain the first "color photographs" of Proxima b. If the planet is partially cloudy, several observing seasons are needed to filter out the cloud noise and obtain more detailed surface maps. If the planet is completely covered by thick clouds, or its surface is completely featureless, we will be able to determine its bulk properties after just a few orbital periods (1-2 months).
And they claim to do this with modest-sized telescopes, with something like a 20-meter aperture. Perhaps with a larger aperture they wouldn't need as long a recording -- you can work with smaller apertures, apparently, but then you have to observe longer to get the contrast high enough... or so I gather (again, I need to read it more carefully). This is in line with what I guessed above.
This paper demonstrates that for many exoplanets it is possible to map their surface or cloud structures using reflected starlight analyzed with the ExoPlanet Surface Imaging (EPSI) technique. We have shown that time-resolved exoplanet photometry (on the scale of a few percent of the exoplanet rotation period), with an SNR as low as 20, can yield enough information to, for example, detect continents on a water-rich world. With higher SNR it may be possible to trace the outline of continents and their large-scale albedo features, such as deserts, vegetation areas, snowfields, icecaps, etc., or even artificial "alien megastructures." Combining EPSI with spectrally resolved data may yield information about exoplanetary subcontinental-scale biomarkers. Planetary "noise" like clouds would limit such observations, but under some circumstances only by decreasing the surface albedo contrast. These data are within reach of the next generation of coronagraphic telescopes. For example, a dedicated hybrid telescope-interferometer of 12-20 m diameter could generate surface maps in different colors for the nearest exoplanet, Proxima b, and a few others. Dozens of Earth-size exoplanets and hundreds of larger planets could be imaged this way with a hybrid telescope of 30 m or larger diameter.
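As a sanity check that a light curve really can encode a map, here is a deliberately simplified 1-D sketch. This is not the paper's method: I assume a made-up, strictly positive visibility kernel (so the linear system is well-conditioned) and unrealistically tiny noise; the real geometry and noise budget are far more involved.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 12                                   # longitudinal albedo bins (30 deg each)
albedo = rng.random(N)                   # unknown "map" to recover

# Toy visibility kernel, peaked toward the sub-observer longitude
theta = 2 * np.pi * np.arange(N) / N
w = np.exp(np.cos(theta))

# As the planet rotates, each flux sample is a shifted weighted sum of the map
A = np.array([np.roll(w, t) for t in range(N)])
flux = A @ albedo + rng.normal(0, 1e-8, N)   # light curve with small noise

recovered = np.linalg.solve(A, flux)
err = np.abs(recovered - albedo).max()
print(f"max error: {err:.2e}")
```

With realistic noise and a smooth kernel, the high-order longitudinal harmonics are barely constrained -- which is presumably why the paper's SNR requirements matter so much.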
Sounds like it's pretty close to being realized. I wouldn't be surprised if there are already teams working on it in secret, behind the scenes; and that in a small number of years -- maybe even this year -- we will get our first glimpse of the surface of an alien world.
I was right on the money!
- Yuli Ban likes this
Posted 02 September 2019 - 08:52 PM
Video about this paper:
Their algorithm is kind of similar to ideas in compressed sensing. The different orientations and positions of the planet relative to its star, and how it affects the reflected light, are acting like different filter settings in single pixel cameras. It gives a way of getting around the diffraction limit in imaging, by increasing the amount of information about the planet's surface you can extract; you just can't extract it in a short period of time, in a single shot.
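The single-pixel-camera analogy can be made concrete with a toy linear model (random masks standing in for the changing reflection geometry; a real compressed-sensing setup would use fewer measurements than unknowns plus a sparsity prior):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 64                                  # unknowns: a flattened toy "scene"
scene = rng.random(n)

m = 96                                  # single-number measurements
masks = rng.random((m, n))              # one random "filter setting" per shot
y = masks @ scene + rng.normal(0, 1e-4, m)

recovered, *_ = np.linalg.lstsq(masks, y, rcond=None)
err = np.abs(recovered - scene).max()
print(f"max error: {err:.2e}")
```

Each measurement is a single scalar, yet with enough distinct masks the whole scene is recoverable -- the diffraction limit constrains one snapshot, not the information accumulated across many.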
In principle, this method should enable imaging of planets dozens or maybe even hundreds of light-years away, using only modest-sized telescopes (say, 30 m, not hundreds of meters). Furthermore, with longer recordings I see no limit to how crisp an image of Proxima Centauri b is possible using just a 30 m ELF telescope. With years of precise measurements, they should be able to image features maybe 100 miles across -- or even 10 miles.
This is even better than the idea I proposed! -- Really cool stuff!
Posted 10 September 2019 - 05:35 PM
I think we still have a few steps to take. As mentioned above, I think after the "exoplanet craze" we seem to be in now, we're going to start looking at the atmospheres and temperatures of other planets and getting hyped about that. (Finding another confirmed oxygen-rich planet that's a few billion years old, with a temperature of 70°F, in the habitable zone of a G-type star with a complex solar system, would certainly cause a lot of excitement.)
Only after that (likely around early mid-century) do we get to the point of literally imaging exoplanets. That's when the next craze happens: when we get our first glimpses of planets with blue and green.
Then after that come unmanned probes sent at sub-luminal speeds to visit some of the closer ones and send back data.
Like before: we got all excited when we imaged other planets and stars; then we finally realized "nebulae" were other galaxies; then, very recently, we started discovering exoplanets. I think we're continuing that trend even now.
I don't disagree that the technology is closer than people think, but it's clear that science is not as huge a priority for a lot of people as it used to be. I mean, isn't NASA barely government-funded anymore? And it's technically possible for us to get to Mars or set up a lunar base, but the cost and feasibility don't justify it yet.
What becomes of man when the things that man can create are greater than man itself?