1800s Astronomical Drawings vs. NASA Images (2016)
sedan_baklazhan 2021-08-18 09:07:44 +0000 UTC [ - ]
malkia 2021-08-18 15:54:09 +0000 UTC [ - ]
xbmcuser 2021-08-18 11:12:46 +0000 UTC [ - ]
lanna 2021-08-18 12:11:34 +0000 UTC [ - ]
zaroth 2021-08-18 12:23:42 +0000 UTC [ - ]
TeMPOraL 2021-08-18 13:50:54 +0000 UTC [ - ]
For example, a street I walked by regularly for almost my whole life recently (in the last few years) got its lighting replaced - each sodium lamp is now replaced with an LED array with no (or an ineffective) diffuser. In other words: each spherical source of light got replaced by a bunch of point sources.
Last time I walked down that street during night hours, I got a vague feeling as if I was playing an old videogame, because both the road and the sidewalk looked like a low-resolution texture viewed up close: full of smudgy blocks that result from texture upscaling. Except those blocks moved in a weird dance, making my head spin when I focused too much on it, kind of like looking at moving Moiré patterns.
Turns out, this was the pattern of shadows thrown by leaves of a tree, when illuminated by half a dozen point light sources.
That's my only complaint, though. LED lights are a win overall.
ericbarrett 2021-08-18 14:26:21 +0000 UTC [ - ]
I believe LED lamps have similar properties to sodium, although I'm not sure how closely they match sodium lamps - there might be greater variance in the spectra emitted due to material differences, whereas all sodium lamps arc through a common atomic element and have very predictable wavelengths.
sumtechguy 2021-08-18 12:59:34 +0000 UTC [ - ]
N1H1L 2021-08-18 14:44:10 +0000 UTC [ - ]
prawn 2021-08-18 13:27:34 +0000 UTC [ - ]
sumtechguy 2021-08-18 13:35:46 +0000 UTC [ - ]
prawn 2021-08-18 13:38:57 +0000 UTC [ - ]
sumtechguy 2021-08-18 14:40:08 +0000 UTC [ - ]
pkaye 2021-08-18 19:30:20 +0000 UTC [ - ]
sillyquiet 2021-08-18 15:02:14 +0000 UTC [ - ]
rikkipitt 2021-08-18 09:03:47 +0000 UTC [ - ]
mrtnmcc 2021-08-18 22:19:21 +0000 UTC [ - ]
The sketch there is actually more accurate than the photo! The flare (corona) is real. Cameras have a hard time picking up the corona (UV filters?). Here is an example of what it looks like to the eye:
mrtnmcc 2021-08-18 22:21:08 +0000 UTC [ - ]
was_a_dev 2021-08-18 08:13:30 +0000 UTC [ - ]
In fact it is all impressive, maybe with the exception of Mars.
sedan_baklazhan 2021-08-18 09:16:09 +0000 UTC [ - ]
darkerside 2021-08-18 12:09:52 +0000 UTC [ - ]
transportguy 2021-08-18 14:05:15 +0000 UTC [ - ]
perl4ever 2021-08-18 22:31:37 +0000 UTC [ - ]
_Microft 2021-08-18 08:41:33 +0000 UTC [ - ]
Drawing comes with practice by the way.
[0] https://en.wikipedia.org/wiki/Orion_Nebula
[1] https://upload.wikimedia.org/wikipedia/commons/5/5b/Orion_co...
derbOac 2021-08-18 12:41:37 +0000 UTC [ - ]
bjarneh 2021-08-18 08:20:37 +0000 UTC [ - ]
dredmorbius 2021-08-18 08:58:29 +0000 UTC [ - ]
Given that many of the frequencies are beyond the limits of human vision (infrared, microwave, and radio at the low end; ultraviolet, x-ray, and gamma-ray at the high end), this is somewhat out of necessity.
The individual channels largely record intensity, so in that regard they're similar to B&W photographs, but the intensities are of specific bands, rather than a wide range of bands as in silver-halide-based photographs. There may be some frequency variation recorded and interpreted as well, though I'm not certain of this.
That said, I'm pretty sure that there's also an eye to aesthetic and public appeal of landmark released images.
This link shows and discusses multi-spectral images, showing the channels individually for several objects:
https://ecuip.lib.uchicago.edu/multiwavelength-astronomy/ast...
The Crab Nebula at frequencies from radio to gamma: https://upload.wikimedia.org/wikipedia/commons/thumb/6/6b/Cr...
An image of the galaxy Centaurus A that includes infrared and x-ray channels: https://apod.nasa.gov/apod/ap210117.html
Today's APOD shows the Ring Nebula in infrared, red, and visible light: https://apod.nasa.gov/apod/ap210818.html
You can find such examples by searching for different EMR aspects, e.g., "radio, ultraviolet, x-ray". These are typically noted in APOD's image descriptions.
qayxc 2021-08-18 08:57:44 +0000 UTC [ - ]
The individual pictures sent by the spacecraft are indeed monochrome images taken with different filters. These filters don't usually correspond with RGB and calibration is required to approximate human perception.
Most of the time there's no equivalent at all (this applies to pretty much all deep space imagery) and the colours are basically made up (the intensities aren't - they correspond to frequency responses, just not necessarily within the human vision spectrum).
Here's a discussion on the topic: http://ivc.lib.rochester.edu/pretty-pictures-the-use-of-fals...
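To make the recombination step concrete, here's a minimal sketch of the kind of channel stacking described above. It assumes the single-filter frames come as FITS files readable with astropy; the file names and weights are made up for illustration rather than taken from any real pipeline:

    import numpy as np
    from astropy.io import fits  # FITS is the usual container for this kind of data

    def load_band(path):
        """Read one monochrome frame and normalise it to the 0..1 range."""
        data = fits.getdata(path).astype(float)
        data -= data.min()
        return data / data.max()

    # Three monochrome exposures taken through different (hypothetical) filters
    r = load_band("filter_673nm.fits")
    g = load_band("filter_656nm.fits")
    b = load_band("filter_501nm.fits")

    # Per-channel weights stand in for the real calibration step
    weights = np.array([1.0, 0.8, 1.2])
    rgb = np.clip(np.dstack([r, g, b]) * weights, 0.0, 1.0)  # (H, W, 3) composite

Changing the weights (or which filter feeds which channel) changes the "colour" of the result without touching the underlying data, which is exactly the interpretation step being discussed.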
TeMPOraL 2021-08-18 14:25:56 +0000 UTC [ - ]
When I look at photos of distant places on Earth, I do so because I'm trying to imagine how they would look if I were actually there. This is the main reason people care about photos in general[0] - they're a means to capture and share an experience. As a non-astronomer, when I'm looking for photos of things in space, I also want to know - first and foremost - how these things would look to my own eyes if I were close enough to see them.
The article mentions NASA defending their image manipulation as popular astronomy's equivalent of red-eye removal. But it's not that. Red-eye removal exists to correct for the difference between a flash-equipped camera system and human eyes. It's meant to manipulate the image strictly to make it closer to what a real human would've seen at the scene. Where is astronomy's actual equivalent of that? Pictures manipulated in such a way as to make them maximally close to what an astronaut in a spacesuit would see if they were hanging around the astronomical object in question? That's what I'd like to see.
Such pictures will not be as exciting as the propped up marketing shoots, but they'll at least allow the viewers to calibrate their understanding of space with reality. I would think this would be seen as more important in the truth-seeking endeavor of science.
--
[0] - Marketing notwithstanding - imagery is so useful in advertising precisely because of this desire; carefully retouched and recomposed pictures are superstimuli, the way sugar is for our sense of taste.
qayxc 2021-08-18 16:09:46 +0000 UTC [ - ]
Well, sorry to disappoint you there but that's technically impossible. For most deep space images human eyes would likely see nothing at all (because we cannot perceive the frequency bands depicted). In other cases it's hard to tell because light conditions are vastly different from our daily perception (lack of atmosphere, harsh contrasts, very little sunlight) and the sensors collected photons for hours - something that human eyes just can't do.
One problem is that the equipment used is simply incapable of recording images as human eyes would see them - try to snap a picture of the evening sky with a smartphone camera to see this first hand (unless your newfangled device sports AI image enhancement, in which case it's just as fake [0]).
> I would think this would be seen as more important in the truth-seeking endeavor of science.
Most scientists see this "truth" as something objective though, and that's strictly not what human perception is in the first place. [0] has a RAW image from a smartphone next to an AI's interpretation of it, and that's a good analogy to what happens with scientific data as well. Neither of the two versions matches what a person would see, yet we accept the AI version as being "good", even though it neither depicts what the camera sensor picked up nor shows what a person would have seen.
So what is "the truth" in this case? Is it the grainy, desaturated and dark raw data from the CCD sensors or the artificial construct that tries to mimic what a human observer might have perceived?
[0] https://www.washingtonpost.com/technology/2018/11/14/your-sm...
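For what it's worth, that "grainy, desaturated and dark raw data" usually goes through a non-linear stretch before anyone looks at it. Here's a rough sketch of an asinh-style stretch (the scale factor is arbitrary), just to illustrate how far the displayed image already is from the raw sensor counts:

    import numpy as np

    def asinh_stretch(raw, scale=0.1):
        """Compress a huge dynamic range into something a screen can display."""
        norm = (raw - raw.min()) / (raw.max() - raw.min())  # rescale to 0..1
        return np.arcsinh(norm / scale) / np.arcsinh(1.0 / scale)

    # e.g. displayable = asinh_stretch(fits.getdata("some_exposure.fits"))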
perl4ever 2021-08-18 22:37:29 +0000 UTC [ - ]
Depending on the definition of "deep".
Presumably in close orbit around a planet, a person would see something. In fact, I've read that even on Pluto, sunlight at noon is still much brighter than moonlight on Earth.
And considering how far away Orion is, what if someone was ten times closer? Is it unthinkable that it might be comparable to the Milky Way?
Cogito 2021-08-19 13:37:51 +0000 UTC [ - ]
The cameras they have sent are quite good and can take 'true colour' images.
The raw images can be coloured as if a human were standing on Mars, but more often they are coloured as if that part of Mars were on Earth. The reason for this is so that scientists can use their Earth-adjusted eyes to study the images - you want minerals in the photo to look familiar to an Earth scientist.
freemint 2021-08-18 18:05:08 +0000 UTC [ - ]
Removing StarLink satellites? ducks
aero-glide2 2021-08-18 08:26:05 +0000 UTC [ - ]
qayxc 2021-08-18 08:52:27 +0000 UTC [ - ]
What the sensors do is take images at multiple wavelengths. Each individual channel is still monochrome, and "real colour" images are produced by interpreting them as red, green and blue with various weights depending on calibration.
If you look at the actual sensor wavelengths, you'll notice that they don't cover the same frequency spectrum as RGB sensors or human eyes. It's all interpretation and calibration (that's why the Mars rovers have a colour calibration plate on them).
Another method (often used in astrophotography) is to use a single monochrome sensor, put RGB filters in front of it, take separate pictures with each filter, and recombine them (done on Earth).
You can see the frequency bands of the filters used in MERs Spirit and Opportunity here: https://mars.nasa.gov/mer/gallery/edr_filename_key.html
(the rovers used 16 different filters, so no RGB there)
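Making an "approximately true colour" picture from such a filter set then boils down to choices like the one sketched below - pick whichever filters sit closest to nominal red/green/blue wavelengths. The filter names and centre wavelengths here are illustrative only; the linked filter key has the real ones:

    # Hypothetical filter table: name -> centre wavelength in nm
    FILTERS_NM = {"L2": 753, "L4": 601, "L5": 535, "L6": 482, "L7": 432}
    TARGET_NM = {"red": 600, "green": 540, "blue": 470}  # rough RGB targets

    def nearest_filter(target_nm):
        """Pick the filter whose centre wavelength is closest to the target."""
        return min(FILTERS_NM, key=lambda name: abs(FILTERS_NM[name] - target_nm))

    choice = {channel: nearest_filter(nm) for channel, nm in TARGET_NM.items()}
    print(choice)  # {'red': 'L4', 'green': 'L5', 'blue': 'L6'} with these made-up numbers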
_Microft 2021-08-18 09:12:54 +0000 UTC [ - ]
This color filter array is called a "Bayer filter".
qayxc 2021-08-18 09:33:11 +0000 UTC [ - ]
They don't take multi-channel pictures in a single shot and each full image represents a single channel. HST for example has three cameras - one per channel - while the MER cameras have a single sensor with multiple filters.
_Microft 2021-08-18 13:15:29 +0000 UTC [ - ]
If one removed the Bayer filter from a sensor (as difficult as that is) and skipped demosaicing while processing the image, one would end up with a monochrome image just fine. Using physical color filters for different channels and taking the same scene for each of them, one could create a single color image with even better resolution than the camera normally produces. Why better? - this might disappoint you now! - consumer cameras count each differently colored 'sub'pixel as a full pixel. So half of the "megapixels" of your camera are e.g. green and just a quarter of them each red and blue (that's a common ratio for the colors). The camera then 'fakes' having a full set of RGB pixels by demosaicing (see https://en.wikipedia.org/wiki/Demosaicing ). Taking monochrome images with physical color filters would allow one to use the full resolution of the camera for each color channel.
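A quick way to see the resolution argument is to pull the sub-grids out of a (simulated) RGGB mosaic and count pixels per colour - this is only a toy illustration of the sampling, not a real demosaicing routine:

    import numpy as np

    mosaic = np.random.randint(0, 4096, size=(8, 8))  # stand-in for a RAW frame

    # In an RGGB pattern each 2x2 cell holds one R, two G and one B sample
    red   = mosaic[0::2, 0::2]
    green = np.concatenate([mosaic[0::2, 1::2].ravel(), mosaic[1::2, 0::2].ravel()])
    blue  = mosaic[1::2, 1::2]

    print(red.size / mosaic.size, green.size / mosaic.size, blue.size / mosaic.size)
    # -> 0.25 0.5 0.25: demosaicing has to interpolate the rest, whereas a
    # monochrome sensor behind a filter wheel records every pixel for every channel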
qayxc 2021-08-18 16:20:06 +0000 UTC [ - ]
> From what I know sensors do not have different readouts for the different color channels.
You know wrong then. HST uses three camera sensors precisely to have three different filters at the same time and at full resolution: https://www.stsci.edu/hst/instrumentation/legacy/wfpc2
So yes, the WF/PC2 does indeed have different readouts per channels because each channel has its own dedicated sensor.
Maybe you simply misinterpreted what I wrote. The fact of the matter is that consumer level sensors return multi-channel images (the sub-pixel filtering is an irrelevant technical detail), while scientific instruments use dedicated sensors per channel.
edit: just to clarify even further - there are no subpixel shenanigans with scientific instruments, both for higher precision and to avoid cross-talk between frequency bands; hence separate sensors per frequency filter. If you want multiple channels at once (e.g. space telescopes), you therefore need multiple sensors as opposed to subpixel filtering.
_Microft 2021-08-18 17:42:23 +0000 UTC [ - ]
That "subpixel shenanigans" on scientific apparatuses are inacceptable should be understood.
tigershark 2021-08-18 14:28:20 +0000 UTC [ - ]
qayxc 2021-08-18 16:57:06 +0000 UTC [ - ]
Yes, a consumer-level camera sensor uses Bayer filtering, but that's just an implementation detail, like the rods and cones inside the retina - you don't perceive each channel individually after all, otherwise you wouldn't be able to perceive "brown" or "pink".
So any analogy between CCD sensor output and human eyes ends right there.
The difference lies in the output of the sensory mechanism (be that human eyes or CCD sensors). While consumer-level hardware outputs multichannel images (yes, by means of subpixels and Bayer filtering, but that's irrelevant - for the sake of argument it might as well be pixies), scientific instruments use separate CCD sensors for each recorded channel to a) keep the full resolution and b) avoid cross-talk between different frequency bands, which is unavoidable with the subpixel setup used in commodity hardware.
So while your smartphone or DSLR's RAW output will be a multichannel image, instruments on board spacecraft will only ever output a single channel per sensor. No subpixels, no Bayer filters. Either one dedicated sensor per recorded channel (e.g. the HST WF/PC2 camera) or multiple images taken in series using a single sensor but multiple filters (e.g. the cameras onboard the MER and MSL rovers).
I hope that was a little clearer.
fulafel 2021-08-18 17:02:18 +0000 UTC [ - ]
qayxc 2021-08-18 17:20:58 +0000 UTC [ - ]
The different bands received from spacecraft, rovers and (space-)telescopes are matched with red, green and blue, even though they're actually taken in near infrared, UV, or even X-ray spectra.
The "truth" is that the images are only monochrome representations of recorded photons of different energies. Most people don't see near UV light for example and some images are taken in very narrow bands and then "stretched" into "full" RGB, etc.
The point is that the recorded data doesn't correspond to the human vision apparatus (or at least not well), and thus any multi-colour representations are necessarily fabricated and sometimes don't even try to reflect what a human would have seen.
tigershark 2021-08-18 17:28:29 +0000 UTC [ - ]
qayxc 2021-08-18 22:19:19 +0000 UTC [ - ]
What's the correct colour channel to use for near UV (yes, I know that some people - mainly women - can perceive that, but the vast majority of people can't)? What matches near IR or even X-rays? How would you colourise the frequency bands that match different elements (those include very narrow bandpass filters) but don't really make sense in terms of RGB?
That's the difference right there. If you have a picture that uses channels of different filter bands that correspond to hydrogen bands, organic molecules or UV radiation, none of that data correlates to human colour perception. So someone just chose shades of green for hydrogen or shades of blue for organics.
So what I'm trying to say is that the underlying mechanism is irrelevant - the data simply isn't colour data as seen by humans.
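In other words, the "colour" assignment is just a lookup table someone picked. One common convention in narrowband astrophotography (the so-called Hubble palette, if I remember right) maps sulphur to red, hydrogen to green and oxygen to blue, but any other assignment would be just as "valid". A toy sketch of that arbitrary mapping step, assuming the band arrays are already stretched to 0..1:

    import numpy as np

    def false_colour(bands, palette):
        """Sum each named band into the RGB channel the palette assigns to it."""
        h, w = next(iter(bands.values())).shape
        rgb = np.zeros((h, w, 3))
        for name, channel in palette.items():
            rgb[..., channel] += bands[name]
        return np.clip(rgb, 0.0, 1.0)

    palette = {"sulphur": 0, "hydrogen": 1, "oxygen": 2}  # -> R, G, B
    # swapping the palette around produces an equally "valid" picture of the same data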
solarized 2021-08-18 09:39:26 +0000 UTC [ - ]
wumms 2021-08-18 08:31:22 +0000 UTC [ - ]
sirfz 2021-08-18 08:37:31 +0000 UTC [ - ]
red_trumpet 2021-08-18 08:40:21 +0000 UTC [ - ]
Edit: Could also be that the photos on the server changed. After all, the article is 5 years old.
sedan_baklazhan 2021-08-18 09:35:04 +0000 UTC [ - ]
mulmen 2021-08-19 00:10:42 +0000 UTC [ - ]
queuebert 2021-08-18 22:34:38 +0000 UTC [ - ]
matoyce 2021-08-18 14:29:56 +0000 UTC [ - ]
spitfire 2021-08-18 16:11:18 +0000 UTC [ - ]
I get that same feeling here, and I love it.
blodkorv 2021-08-18 10:34:50 +0000 UTC [ - ]
holoduke 2021-08-18 11:25:21 +0000 UTC [ - ]
perl4ever 2021-08-18 22:46:57 +0000 UTC [ - ]
"If the dark energy in the universe increases without limit, it could overcome all forces that hold the universe together. The key value is the equation of state parameter w, the ratio between the dark energy pressure and its energy density. If −1 < w < 0, the expansion of the universe tends to accelerate, but the dark energy tends to dissipate over time, and the Big Rip does not happen."
"According to the latest cosmological data available, the uncertainties are still too large to discriminate among the three cases w < −1, w = −1, and w > −1."
"In their paper, the authors consider a hypothetical example with w = −1.5, H0 = 70 km/s/Mpc, and Ωm = 0.3, in which case the Big Rip would happen approximately 22 billion years from the present. In this scenario, galaxies would first be separated from each other about 200 million years before the Big Rip. About 60 million years before the Big Rip, galaxies would begin to disintegrate as gravity becomes too weak to hold them together. Planetary systems like the Solar System would become gravitationally unbound about three months before the Big Rip, and planets would fly off into the rapidly expanding universe. In the last minutes, stars and planets would be torn apart, and the now-dispersed atoms would be destroyed about 10−19 seconds before the end. At the time the Big Rip occurs, even spacetime itself would be ripped apart and the scale factor would be infinity"
goohle 2021-08-18 11:44:14 +0000 UTC [ - ]
dylan604 2021-08-18 13:51:16 +0000 UTC [ - ]
So, I'm not sure who the "we" will be that will have less than 14 billion years.
dcuthbertson 2021-08-18 12:38:06 +0000 UTC [ - ]
stunt 2021-08-18 09:25:37 +0000 UTC [ - ]
dredmorbius 2021-08-18 12:51:21 +0000 UTC [ - ]
The detail of the sunspot images is exquisite.
The other aspect is that until the advent of photography, all image preservation was mediated by the human eye, mind, and artistic ability, and often by second- or further-removed accounts and retellings of the original events or objects. (See Albrecht Dürer's rhinoceros for one of my favourite examples of this.) The century or so from the mid-19th through the mid-20th century, when direct analogue impressions of images, sound, and movement were possible, gave us a period of robustly reliable records (within the limits of equipment, and still subject to manipulation). The age of computer-modified and -generated imagery once again leaves us with high-fidelity images which may have remote, little, or no bearing on any actual reality. This includes, for what it's worth, many of the NASA images provided, which are more data interpretations than realistic representations of astronomical objects.
GuB-42 2021-08-18 23:01:53 +0000 UTC [ - ]
That's because a good drawing will show all the important parts but none of the unnecessary details. For example the drawing of a jaw will show every detail of the teeth and bones, but not the background, supports, blemishes and minor damage that are a result of handling and not relevant to the anatomy.
We were taught a few techniques for drawing accurate representations, and after a few hours even I, with my abysmal artistic skills, managed to produce something decent.
All that is to say that with proper training, people could make incredibly accurate representations of what they saw, most likely better than early photographs.
sandworm101 2021-08-18 20:08:34 +0000 UTC [ - ]
Don't forget the physical limitations of the medium. Want to paint a nebula as it fades from white to the black of space? Try to paint a gradient between two colors, a steady blend from white to black across a few inches. It is the sort of thing even the great masters never attempted because it is basically impossible.
These images were not meant to be exact representations. These are anatomy diagrams. They allow the reader to understand and recognize the objects rather than being mirror images of them. We have photographs, but doctors still learn from hand-drawn diagrams.
wumpus 2021-08-19 02:35:13 +0000 UTC [ - ]
narag 2021-08-18 14:50:58 +0000 UTC [ - ]
I have a very old astronomy book that includes similar drawings of Jupiter and they're much more detailed and realistic.
dredmorbius 2021-08-18 15:40:35 +0000 UTC [ - ]
The Hubble Ultra Deep Field image (a follow-up to the original 1995 Deep Field image of an "empty" region of sky) has a total exposure time of just under 1 million seconds (11 days), with single-channel exposures of at least about 1.5 days.
This somewhat exceeds the duration and light-collecting capabilities of a human observer.
https://en.wikipedia.org/wiki/Hubble_Ultra-Deep_Field
Though granted, such prolonged exposures aren't necessary for the nearer observations (many within the solar system) included in the article here.
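(For scale: 1,000,000 seconds / 86,400 seconds per day ≈ 11.6 days, which is where the "11 days" figure comes from.)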
redisman 2021-08-18 15:48:53 +0000 UTC [ - ]
tusslewake 2021-08-18 15:12:17 +0000 UTC [ - ]
narag 2021-08-18 16:28:27 +0000 UTC [ - ]