
1800s Astronomical Drawings vs. NASA Images (2016)

dredmorbius 2021-08-18 12:51:21 +0000 UTC [ - ]

What captivates me about the drawings is the range in detail and accuracy. Keep in mind that naked-eye observations are difficult, limited to the colour sensitivities of the human eye, and often made at or near the limits of perception --- the fanciful details on Mars are extrapolations from a very small, blurry, rusty dot.

The detail of the sunspot images is exquisite.

The other aspect is that until the advent of photography, all image preservation was mediated by the human eye, mind, and artistic ability, and often by second- or further-removed accounts and retellings of the original events or objects. (See Albrecht Dürer's rhinoceros for one of my favourite examples of this.) The century or so from the mid-19th to the mid-20th, when direct analogue impressions of images, sound, and movement were possible, gave us a period of robustly reliable records (within the limits of the equipment, and still subject to manipulation). The age of computer-modified and -generated imagery once again leaves us with high-fidelity images which may have remote, little, or no bearing on any actual reality. This includes, for what it's worth, many of the NASA images provided, which are more data interpretations than realistic representations of astronomical objects.

GuB-42 2021-08-18 23:01:53 +0000 UTC [ - ]

In fact, my biology teacher taught us the importance of drawings over photographs. This was in the mid-90s, so the point wasn't about technical limitations.

That's because a good drawing will show all the important parts but none of the unnecessary details. For example, the drawing of a jaw will show every detail of the teeth and bones, but not the background, supports, blemishes, and minor damage that result from handling and are not relevant to the anatomy.

We were taught a few techniques for drawing accurate representations, and after a few hours, even I, with my abysmal artistic skills, managed to produce something decent.

All that to say that with proper training, people could make incredibly accurate representations of what they saw, most likely better than early photographs.

sandworm101 2021-08-18 20:08:34 +0000 UTC [ - ]

>> and artistic ability

Don't forget the physical limitations of the medium. Want to paint a nebula as it fades from white to the black of space? Try to paint a gradient between two colors, a steady blend from white to black across a few inches. It is the sort of thing even the great masters never attempted because it is basically impossible.

These images were not meant to be exact representations. They are like anatomy diagrams: they let the reader understand and recognize the objects rather than serve as mirror images. We have photographs, but doctors still learn from hand-drawn diagrams.

wumpus 2021-08-19 02:35:13 +0000 UTC [ - ]

You should watch the documentary Tim's Vermeer.

narag 2021-08-18 14:50:58 +0000 UTC [ - ]

I also thought that wasn't "fair" since many of the comparison images are taken with long exposure times.

I have a very old astronomy book that includes similar drawings of Jupiter and they're much more detailed and realistic.

dredmorbius 2021-08-18 15:40:35 +0000 UTC [ - ]

Where "long" may be measured in anything from hours to months.

The Hubble Ultra Deep Field image (a follow-up to the original 1995 Deep Field image of an "empty" region of sky) has a total exposure time of just under 1 million seconds (11 days), with single-channel exposures of at least about 1.5 days.

This somewhat exceeds the duration and light-collecting capabilities of a human observer.

https://en.wikipedia.org/wiki/Hubble_Ultra-Deep_Field

Though granted, such prolonged exposures aren't necessary for the nearer observations (many within the solar system) included in the article here.

redisman 2021-08-18 15:48:53 +0000 UTC [ - ]

Especially the Orion Nebula. I see the picture on the left and think oh yeah that’s almost exactly what I see with my scope. Then on the right it’s some crazy photoshopped (playing with channels) long exposure.

tusslewake 2021-08-18 15:12:17 +0000 UTC [ - ]

Curious, what is the book?

narag 2021-08-18 16:28:27 +0000 UTC [ - ]

"Astronomía" in Spanish, by José Comas Solá. It was a gift for me, almost 50 years ago. I think it was a little dated already.

sedan_baklazhan 2021-08-18 09:07:44 +0000 UTC [ - ]

It is surprising (to say the least) to see "Book of killed poets" in Russian (which is in fact some Soviet-era photo paper) as NASA's Saturn image.

malkia 2021-08-18 15:54:09 +0000 UTC [ - ]

Came here to note that, and to show off my poor Russian language skills (Bulgarian here) :)

xbmcuser 2021-08-18 11:12:46 +0000 UTC [ - ]

Most people do not realise how much light and haze pollution we have today compared to 200 years ago, and how visible the stars once were. Even remote places today are not the same, as the atmosphere still holds more particles than it did in the 1800s.

lanna 2021-08-18 12:11:34 +0000 UTC [ - ]

This picture shows the difference after the city of Dunedin, New Zealand, changed all its sodium lights to shielded LEDs: https://i.redd.it/hxxp2c3ksdh71.jpg

zaroth 2021-08-18 12:23:42 +0000 UTC [ - ]

The color temperature of sodium lights is so much better at night.

TeMPOraL 2021-08-18 13:50:54 +0000 UTC [ - ]

Also sodium lights don't create so many headache-inducing shadow patterns.

For example, a street I have walked regularly for almost my whole life recently (in the last few years) got its lighting replaced - each sodium lamp was replaced with an LED array with no (or an ineffective) diffuser. In other words: each spherical source of light got replaced by a bunch of point sources.

Last time I walked down that street during night hours, I got a vague feeling as if I was playing an old videogame, because both the road and the sidewalk looked like a low-resolution texture viewed up close: full of smudgy blocks that result from texture upscaling. Except those blocks moved in a weird dance, making my head spin when I focused too much on it, kind of like looking at moving Moiré patterns.

Turns out, this was the pattern of shadows thrown by leaves of a tree, when illuminated by half a dozen point light sources.

That's my only complaint, though. LED lights are a win overall.

ericbarrett 2021-08-18 14:26:21 +0000 UTC [ - ]

San Jose used sodium lamps for years in deference to Lick Observatory, which is on Mt. Hamilton about 15 miles east of the city. The biggest advantage of sodium lights over incandescent for astronomy is that the spectral lines are very distinct and easy enough to filter out when doing scientific studies; incandescent, on the other hand, floods the spectrum broadly across the visible range.

I believe LED lamps have similar properties to sodium, although I'm not sure how closely they compare—there might be greater variance in the emitted spectra due to material differences, whereas all sodium lamps arc through a common atomic element and have very predictable wavelengths.
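
(A minimal sketch of why a narrow-line source is easy to remove from data. Everything here — array names, resolution, notch width — is invented for illustration; only the sodium D-doublet wavelengths are real.)

    import numpy as np

    # Hypothetical 1-nm-resolution sky spectrum across the visible range (380-780 nm).
    wavelengths = np.arange(380.0, 781.0, 1.0)                      # nm
    spectrum = np.random.default_rng(0).random(wavelengths.size)    # stand-in intensities

    # Low-pressure sodium emits almost all its light in the D doublet (~589.0 / 589.6 nm),
    # so a narrow notch removes the streetlight contribution while keeping the rest.
    sodium_d = (589.0, 589.6)
    notch_halfwidth = 2.0   # nm, hypothetical filter width
    keep = np.ones_like(spectrum, dtype=bool)
    for line in sodium_d:
        keep &= np.abs(wavelengths - line) > notch_halfwidth

    filtered = np.where(keep, spectrum, 0.0)
    # A broadband source (like an incandescent lamp) has no such narrow lines to cut out,
    # which is why it is much harder to remove from astronomical data.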

sumtechguy 2021-08-18 12:59:34 +0000 UTC [ - ]

I have only had the pleasure of seeing the Milky Way once. It was in the middle of a lone stretch of road in Arkansas. I have not seen it since then, 25 years ago. I am the only person in my neighborhood who turns off the porch light at night. I want to see that again and to share it with others. But alas, I do not think it will happen :(

N1H1L 2021-08-18 14:44:10 +0000 UTC [ - ]

When I was a grad student at Penn State, we camped out at Cherry Springs and saw the Milky Way. It was humbling, and I recommend it to everyone at least once in their life.

prawn 2021-08-18 13:27:34 +0000 UTC [ - ]

Have you not had the chance in 25 years to go somewhere that gave you a decent view of it? Camping or a farmstay?

sumtechguy 2021-08-18 13:35:46 +0000 UTC [ - ]

overcast and raining...

prawn 2021-08-18 13:38:57 +0000 UTC [ - ]

For 25 years?

sumtechguy 2021-08-18 14:40:08 +0000 UTC [ - ]

well yeah... most of the things I like to do are in the city. So it just does not come up very often. When it did...

pkaye 2021-08-18 19:30:20 +0000 UTC [ - ]

I've seen it at a couple places. One is Lowell observatory in Flagstaff AZ. They have pretty good light conditions despite being in a city. Second is in certain national parks which tend to be away from any big city. Third was on a plane flight over the pacific ocean.

sillyquiet 2021-08-18 15:02:14 +0000 UTC [ - ]

I lived and worked in a small town in the Mojave for close to a decade. One of the few perks of living there was the dark sky - the Milky Way, for example, was eminently visible on most nights in a way I think most people never see.

https://www.darksky.org btw

rikkipitt 2021-08-18 09:03:47 +0000 UTC [ - ]

Some of the chalk drawings by Lord Rosse et al. at Birr Castle in the Republic of Ireland are amazing considering they were made in the 1840s.

https://birrcastle.com/astronomy/

https://birrcastle.com/telescope-astronomy/

mrtnmcc 2021-08-18 22:19:21 +0000 UTC [ - ]

> Total Eclipse of the Sun. I personally enjoy Trouvelot’s added artistic flair (or flare, if you want to be punny) on this one.

The sketch there is actually more accurate than the photo! The flare (corona) is real. Cameras have a hard time picking up the corona (UV filters?). Here is an example of what it looks like to the eye:

https://visitidaho.org/content/uploads/2016/12/Eclipse.png

mrtnmcc 2021-08-18 22:21:08 +0000 UTC [ - ]

Until seeing that, I never understood why people go through such effort to see it from the path of totality strip. It's a religious experience.

was_a_dev 2021-08-18 08:13:30 +0000 UTC [ - ]

I can't comprehend how some artist manages to draw something like a gas cloud so accurately.

In fact it is all impressive, maybe with the exception of Mars.

sedan_baklazhan 2021-08-18 09:16:09 +0000 UTC [ - ]

Check out Mars from a decent telescope. It's a blurry tiny yellow circle with very unclear "shadows" on it. They drew it really well. Much more impressive than Orion in fact.

darkerside 2021-08-18 12:09:52 +0000 UTC [ - ]

Why is Orion unimpressive? Because the author didn't include visible representations of wavelengths that are outside of the range of human detectability?

transportguy 2021-08-18 14:05:15 +0000 UTC [ - ]

The drawing of Orion is unimpressive, as it is easily visible and drawable to a similar quality with any telescope, or even binoculars, if you have clear skies. Spotting any sort of features on Mars requires huge magnification, incredible steadiness, and an ability to track the object. Drawing the details of Mars required orders of magnitude more skill and far better technology.

perl4ever 2021-08-18 22:31:37 +0000 UTC [ - ]

For reference, Orion is about 3,600 arcseconds wide [http://spider.seds.org/ngc/revngcic.cgi?NGC1976], whereas Mars is from 4 to 25 arcseconds [https://web.archive.org/web/20100612092806/http://nssdc.gsfc...] depending on its distance.

_Microft 2021-08-18 08:41:33 +0000 UTC [ - ]

The Orion Nebula [0] appears relatively large in the sky, about 1° (= 60 arc minutes) wide. This is approximately twice as wide as the full Moon appears. With a telescope, you should easily be able to make out details. It's the white-ish blob in the "vertical" chain of stars in the lower center of this image [1]. The yellow-orange star in the top left is Betelgeuse, by the way. It was in the news a while ago because it had dimmed a lot and some people (not astronomers, though) were hoping for a supernova. That's a different star from "Tabby's Star", which was also in the news because of unexpected dimming.

Drawing comes with practice by the way.

[0] https://en.wikipedia.org/wiki/Orion_Nebula

[1] https://upload.wikimedia.org/wikipedia/commons/5/5b/Orion_co...

derbOac 2021-08-18 12:41:37 +0000 UTC [ - ]

Certain elements of the Mars drawing have me looking for an explanation. Maybe it's just artistic license, but other features are so similar to contemporary photos that it makes me think there's a more practical explanation.

bjarneh 2021-08-18 08:20:37 +0000 UTC [ - ]

The NASA images of space are colorized black-and-white images, if I'm not mistaken. So artists are involved in NASA's images as well, it seems.

dredmorbius 2021-08-18 08:58:29 +0000 UTC [ - ]

Most scientific astronomical images are based on the capture of specific bands and frequencies, which are then assigned colours for visual interpretation.

Given that many of the frequencies are beyond the limits of human vision (infrared, microwave, and radio at the low end; ultraviolet, X-ray, and gamma-ray at the high), this is somewhat out of necessity.

The individual channels largely record intensity, so in that regard they're similar to B&W photographs, but the intensities are of specific bands, rather than a wide range of bands as in silver-halide-based photographs. There may be some frequency variation recorded and interpreted as well; I'm not certain of this.
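
(As a rough sketch of what "assigning colours to channels" means in practice — the function and array names here are mine, not any actual NASA pipeline, and the three bands are just stand-ins.)

    import numpy as np

    def normalize(band):
        """Stretch one monochrome intensity map to the 0..1 range."""
        lo, hi = band.min(), band.max()
        return (band - lo) / (hi - lo) if hi > lo else np.zeros_like(band)

    def false_colour(xray, optical, infrared):
        """Assign three single-band intensity maps (bands the eye may not
        perceive at all) to the red, green, and blue display channels."""
        return np.dstack([normalize(xray), normalize(optical), normalize(infrared)])

    # Synthetic stand-ins; real inputs would be calibrated exposures per band.
    rng = np.random.default_rng(1)
    rgb = false_colour(rng.random((64, 64)), rng.random((64, 64)), rng.random((64, 64)))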

That said, I'm pretty sure that there's also an eye to aesthetic and public appeal of landmark released images.

This link shows and discusses multi-spectral images, showing the channels individually for several objects:

https://ecuip.lib.uchicago.edu/multiwavelength-astronomy/ast...

The Crab Nebula at frequencies from radio to gamma: https://upload.wikimedia.org/wikipedia/commons/thumb/6/6b/Cr...

Galaxy Centaurus-A image which includes infrared and x-ray channels: https://apod.nasa.gov/apod/ap210117.html

Today's APOD shows the Ring Nebula in infrared, red, and visible light: https://apod.nasa.gov/apod/ap210818.html

You can find such examples by searching for different EMR aspects, e.g., "radio, ultraviolet, x-ray". These are typically noted in APOD's image descriptions.

qayxc 2021-08-18 08:57:44 +0000 UTC [ - ]

You are correct. NASA employs artists to generate published press images.

The individual pictures sent by the spacecraft are indeed monochrome images taken with different filters. These filters don't usually correspond to RGB, and calibration is required to approximate human perception.

Most of the time there's no equivalent at all (this applies to pretty much all deep space imagery) and the colours are basically made up (the intensities aren't - they correspond to frequency responses, just not necessarily within the human vision spectrum).

Here's a discussion on the topic: http://ivc.lib.rochester.edu/pretty-pictures-the-use-of-fals...

TeMPOraL 2021-08-18 14:25:56 +0000 UTC [ - ]

I'm still conflicted. The article you linked goes deep into philosophy of knowledge, but still arguably misses the most important point - false color can be a tool for analysis, pretty pictures are entertaining (and/or good marketing), but neither of them let us - individuals - get closer to the phenomena.

When I look at photos of distant places on Earth, I do so because I'm trying to imagine how would they look like if I was actually there. This is the main reason people care about photos in general[0] - they're means to capture and share an experience. As a non-astronomer, when I'm looking for photos of things in space, I also want to know - first and foremost - how would these things look to my own eyes if I was close enough to see them.

The article mentions NASA defending their image manipulation as popular astronomy's equivalent of red-eye removal. But it's not that. Red-eye removal exists to correct for the difference between a flash-equipped camera system and human eyes. It's meant to manipulate the image strictly to make it closer to what a real human would've seen on the scene. Where is astronomy's actual equivalent of that? Pictures manipulated in such a way to make them maximally close to what an astronaut in a spacesuit would see, if they were hanging around the astronomical object in question? That's what I'd like to see.

Such pictures will not be as exciting as the propped up marketing shoots, but they'll at least allow the viewers to calibrate their understanding of space with reality. I would think this would be seen as more important in the truth-seeking endeavor of science.

--

[0] - Marketing notwithstanding - the reason imagery is so useful in advertising is because of this desire; carefully retouched and recomposed pictures are superstimuli, the way sugar is for our sense of taste.

qayxc 2021-08-18 16:09:46 +0000 UTC [ - ]

> [...] what an astronaut in a spacesuit would see, if they were hanging around the astronomical object in question? That's what I'd like to see.

Well, sorry to disappoint you there, but that's technically impossible. For most deep-space images, human eyes would likely see nothing at all (because we cannot perceive the frequency bands depicted). In other cases it's hard to tell, because light conditions are vastly different from our daily perception (lack of atmosphere, harsh contrasts, very little sunlight) and the sensors collected photons for hours - something human eyes just can't do.

One problem is that the equipment used is simply incapable of recording images as human eyes would see them - try to snap a picture of the evening sky with a smartphone camera to see this first hand (unless your newfangled device sports AI image enhancement, in which case it's just as fake [0]).

> I would think this would be seen as more important in the truth-seeking endeavor of science.

Most scientists see this "truth" as something objective, though, and that's strictly not what human perception is in the first place. [0] has a RAW image from a smartphone next to an AI's interpretation, and that's a good analogy to what happens with scientific data as well. Neither of the two versions matches what a person would see, yet we accept the AI version as being "good", even though it neither depicts what the camera sensor picked up nor shows what a person would have seen.

So what is "the truth" in this case? Is it the grainy, desaturated and dark raw data from the CCD sensors or the artificial construct that tries to mimic what a human observer might have perceived?

[0] https://www.washingtonpost.com/technology/2018/11/14/your-sm...

perl4ever 2021-08-18 22:37:29 +0000 UTC [ - ]

>For most deep space images human eyes would likely see nothing at all

Depending on the definition of "deep".

Presumably in close orbit around a planet, a person would see something. In fact, I've read that even on Pluto, sunlight at noon is still much brighter than moonlight on Earth.

And considering how far away Orion is, what if someone were ten times closer? Is it unthinkable that it might be comparable to the Milky Way?

Cogito 2021-08-19 13:37:51 +0000 UTC [ - ]

An aspect of this I find fascinating is the different ways photos from Mars are coloured.

The cameras they have sent are quite good and can take 'true colour' images.

From the raw images, they can be coloured as if a human were standing on Mars, but more often they are coloured as if that part of Mars were on Earth. The reason is that scientists can then use their Earth-adjusted eyes to study the images - you want minerals in the photo to look familiar to an Earth scientist.

freemint 2021-08-18 18:05:08 +0000 UTC [ - ]

> Where is astronomy's actual equivalent of that?

Removing Starlink satellites? ducks

aero-glide2 2021-08-18 08:26:05 +0000 UTC [ - ]

Nah, many space probes have colour cameras too. Here are some images from ISRO's Mars Orbiter: https://www.isro.gov.in/pslv-c25-mars-orbiter-mission/pictur...

qayxc 2021-08-18 08:52:27 +0000 UTC [ - ]

These aren't colour images - they're all coloured after the fact.

What the sensors do is take images at multiple wavelengths. Each individual channel is still monochrome, and "real colour" images are produced by interpreting them as red, green, and blue with various weights depending on calibration.

If you look at the actual sensor wavelengths, you'll notice that they don't cover the same frequency spectrum as RGB sensors or human eyes. It's all interpretation and calibration (that's why the Mars rovers have colour calibration targets on them).

Another method (often used in astrophotography) is to use a single monochrome sensor, put RGB filters in front of it, take separate pictures with each filter, and recombine them (done on Earth).

You can see the frequency bands of the filters used in MERs Spirit and Opportunity here: https://mars.nasa.gov/mer/gallery/edr_filename_key.html

(the rovers used 16 different filters, so no RGB there)
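
(A toy version of that recombine-and-calibrate step. The gain-from-a-grey-target idea is the same one the rover calibration targets serve, but the numbers, names, and three-filter setup here are invented for illustration.)

    import numpy as np

    # Three monochrome exposures taken through different filters (synthetic stand-ins).
    rng = np.random.default_rng(2)
    exposures = {"red_filter": rng.random((32, 32)),
                 "green_filter": rng.random((32, 32)),
                 "blue_filter": rng.random((32, 32))}

    # Brightness of a grey calibration patch as measured in each exposure, and the
    # reflectance that patch is known to have. The per-channel gains make the patch
    # come out neutral grey in the combined image.
    measured_patch = {"red_filter": 0.42, "green_filter": 0.55, "blue_filter": 0.31}
    known_reflectance = 0.5
    gains = {name: known_reflectance / value for name, value in measured_patch.items()}

    rgb = np.dstack([np.clip(exposures[name] * gains[name], 0.0, 1.0)
                     for name in ("red_filter", "green_filter", "blue_filter")])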

_Microft 2021-08-18 09:12:54 +0000 UTC [ - ]

A smartphone camera, DSLR, or other camera does not work any differently, though. It's just that an array of color filters is fixed above the sensor's pixels, which would otherwise also be monochrome. The readout of these pixels is then automatically converted to a color image, or stored as RAW for later processing.

This color filter array is called a "Bayer filter".

https://en.wikipedia.org/wiki/Bayer_filter

qayxc 2021-08-18 09:33:11 +0000 UTC [ - ]

I'd say there's still a difference in that while DSLRs and smartphone sensors have individual readouts for each colour channel, scientific instruments do not. So a DSLR or smartphone camera still outputs a multichannel image with a single sensor while scientific instruments are strictly monochrome.

They don't take multi-channel pictures in a single shot, and each full image represents a single channel. HST for example has three cameras - one per channel - while the MER cameras have a single sensor with multiple filters.

_Microft 2021-08-18 13:15:29 +0000 UTC [ - ]

From what I know, sensors do not have different readouts for the different color channels. All pixels are read out the same way and, with knowledge of the pattern of the Bayer filter in use, are then processed ("de-mosaiced") into a color image.

If one removed the Bayer filter from a sensor (as difficult as that is) and skipped demosaicing while processing the image, one would end up with a monochrome image just fine. Using physical color filters for different channels and taking the same scene with each of them, one could create a single color image with even better resolution than the camera normally would. Why better? - this might disappoint you now! - consumer cameras count each differently colored 'sub'pixel as a full pixel. So half of the "megapixels" of your camera are e.g. green and just a quarter of them each red and blue (that's a common ratio for the colors). The camera then 'fakes' having a full set of RGB pixels by demosaicing (see https://en.wikipedia.org/wiki/Demosaicing). Taking monochrome images with physical color filters would allow the full resolution of the camera to be used for each color channel.
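
(A minimal sketch of what that mosaic layout implies for resolution — nothing camera-specific, just the half-green, quarter-red, quarter-blue pattern described above, assuming the common RGGB tile and using a random array as a stand-in for the raw readout.)

    import numpy as np

    # A raw sensor readout is a single monochrome array; the Bayer filter determines
    # which colour each pixel actually sampled. Assume an RGGB 2x2 tile layout.
    raw = np.random.default_rng(3).random((8, 8))

    red   = raw[0::2, 0::2]                            # 1/4 of the pixels
    green = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # 1/2 of the pixels, averaged here
    blue  = raw[1::2, 1::2]                            # 1/4 of the pixels

    # Naive "demosaic": each colour plane has only half the sensor's linear resolution.
    half_res_rgb = np.dstack([red, green, blue])
    # Real demosaicing interpolates each plane back up to full resolution instead,
    # which is the 'faking' of full-resolution RGB described above.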

qayxc 2021-08-18 16:20:06 +0000 UTC [ - ]

That's exactly what I wrote, though.

> From what I know sensors do not have different readouts for the different color channels.

You know wrong then. HST uses three camera sensors precisely to have three different filters at the same time and at full resolution: https://www.stsci.edu/hst/instrumentation/legacy/wfpc2

So yes, the WF/PC2 does indeed have different readouts per channels because each channel has its own dedicated sensor.

Maybe you simply misinterpreted what I wrote. The fact of the matter is that consumer-level sensors return multi-channel images (the sub-pixel filtering is an irrelevant technical detail), while scientific instruments use dedicated sensors per channel.

Edit: just to clarify even further - there are no subpixel shenanigans with scientific instruments, both for higher precision and to avoid cross-talk between frequency bands; hence separate sensors per frequency filter. If you want multiple channels at once (e.g. space telescopes), you therefore need multiple sensors as opposed to subpixel filtering.

_Microft 2021-08-18 17:42:23 +0000 UTC [ - ]

OK, I think I understand the confusion now. I understood "readout" as a hardware implementation detail (think: "readout circuit"), while it seems to refer to the data of a particular channel, right? My lack of familiarity with the lingo seems to be to blame, then.

That "subpixel shenanigans" are unacceptable on scientific apparatus should be understood.

tigershark 2021-08-18 14:28:20 +0000 UTC [ - ]

What you are saying doesn't make much sense. It's obvious that the images are taken at different wavelengths and recombined. Cameras with a Bayer filter use exactly the same principle but have worse spatial resolution, because they have micro-filters on each pixel and the result is interpolated. Foveon sensors work in the same way, with different layers sensitive to specific wavelengths. Even our eyes work in the same way, with rods and cones sensitive to luminance and to different wavelengths. By your definition, what we see are not colour images either, since they are processed after the fact by our brain from the separate wavelength signals coming from the receptors in our eyes.

qayxc 2021-08-18 16:57:06 +0000 UTC [ - ]

You misinterpreted what I wrote then.

Yes, a consumer-level camera sensor uses Bayer filtering, but that's just an implementation detail, just like the rods and cones inside the retina - you don't perceive each channel individually, after all, otherwise you wouldn't be able to perceive "brown" or "pink".

So any analogy between CCD sensor output and human eyes ends right there.

The difference lies in the output of the sensory mechanism (be that human eyes or CCD sensors). While consumer-level hardware outputs multichannel images (yes, by means of subpixels and Bayer filtering, but that's irrelevant - for the sake of argument it could just as well be pixies), scientific instruments use separate CCD sensors for each recorded channel to a) keep the full resolution and b) avoid cross-talk between different frequency bands, which is unavoidable with the subpixel setup used in commodity hardware.

So while your smartphone's or DSLR's RAW output will be a multichannel image, instruments on board spacecraft will only ever output a single channel per sensor. No subpixels, no Bayer filters. Either one dedicated sensor per recorded channel (e.g. the HST WF/PC2 camera) or multiple images taken in series using a single sensor but multiple filters (e.g. the cameras onboard the MER and MSL rovers).

I hope that was a little clearer.

fulafel 2021-08-18 17:02:18 +0000 UTC [ - ]

I'd argue that if you have sensors that take images at multiple wavelengths, and they are combined, it amounts to an imaging system (= camera) that senses colours - vs. the upthread characterization of "colorized black-and-white images".

qayxc 2021-08-18 17:20:58 +0000 UTC [ - ]

The issue is that the multi-wavelength combined image doesn't correspond to RGB like the RAW output of a DSLR or smartphone camera does.

The different bands received from spacecraft, rovers, and (space) telescopes are matched to red, green, and blue, even though they're actually taken in near-infrared, UV, or even X-ray spectra.

The "truth" is that the images are only monochrome representations of recorded photons of different energies. Most people don't see near-UV light, for example, and some images are taken in very narrow bands and then "stretched" into "full" RGB, etc.

The point is that the recorded data doesn't correspond to the human vision apparatus (or at least not well), and thus any multi-colour representations are necessarily fabricated and sometimes don't even try to reflect what a human would have seen.

tigershark 2021-08-18 17:28:29 +0000 UTC [ - ]

The Foveon sensor that I mentioned before works in exactly the same way, getting 3 full-resolution images at different wavelengths, exactly like the scientific cameras. And I'd argue that the only difference between this approach and the eye/Bayer sensor is the resolution; the underlying mechanism is exactly the same.

qayxc 2021-08-18 22:19:19 +0000 UTC [ - ]

But does the Foveon sensor use RGB filters or does it work with different frequency bands? That's a major issue with science data - it doesn't contain RGB data for the most part, because that's not what scientists are interested in.

What's the correct colour channel to use for near UV (yes, I know that some people - mainly women - can perceive it, but the vast majority of people can't)? What matches near IR or even X-rays? How would you colourise the frequency bands that match different elements (those include very narrow bandpass filters) but don't really make sense in terms of RGB?

That's the difference right there. If you have a picture that uses channels of different filter bands that correspond to hydrogen bands, organic molecules or UV radiation, none of that data correlates to human colour perception. So someone just chose shades of green for hydrogen or shades of blue for organics.

So what I'm trying to say is that the underlying mechanism is irrelevant - the data simply isn't colour data as seen by humans.

solarized 2021-08-18 09:39:26 +0000 UTC [ - ]

Back when the sky was still crystal clear to see. That's the reason why almost every polymath of that era also mastered astronomy.

wumms 2021-08-18 08:31:22 +0000 UTC [ - ]

What happened to NASA's Saturn?

sirfz 2021-08-18 08:37:31 +0000 UTC [ - ]

Click on the link under the picture to view the intended photo. I'm still curious how this error happened, though.

red_trumpet 2021-08-18 08:40:21 +0000 UTC [ - ]

Also it's the same as the teaser photo at the beginning of the article.

Edit: Could also be that the photos on the server changed. After all the article is 5 years old.

sedan_baklazhan 2021-08-18 09:35:04 +0000 UTC [ - ]

It became some vintage Soviet photo paper with a weird name.

mulmen 2021-08-19 00:10:42 +0000 UTC [ - ]

I was recently at The Maritime Museum in Astoria, OR. They had a selection of Enlightenment-era maps of the world and of North America specifically. Most of them were indecipherable.

queuebert 2021-08-18 22:34:38 +0000 UTC [ - ]

Was rotating the NASA images to match the drawings too much to ask?

matoyce 2021-08-18 14:29:56 +0000 UTC [ - ]

The 1800s astronomical drawings are very detailed and accurate. I do wonder how beautiful the night skies were 100-200 years ago. Today's light pollution makes it hard for us to appreciate the beauty of the night sky.

spitfire 2021-08-18 16:11:18 +0000 UTC [ - ]

These remind me of the Group of Seven artist Lawren Harris. He painted glaciers and nature landscapes in a sort of art-deco style. The colour absolutely glows from his paintings.

I get that same feeling here, and I love it.

blodkorv 2021-08-18 10:34:50 +0000 UTC [ - ]

I wonder what our pictures of space will be seen as in the future.

holoduke 2021-08-18 11:25:21 +0000 UTC [ - ]

Empty. Space is expanding. All matter around us will vanish beyond the event horizon. It will take some trillion years, though.

perl4ever 2021-08-18 22:46:57 +0000 UTC [ - ]

https://en.wikipedia.org/wiki/Big_Rip

"If the dark energy in the universe increases without limit, it could overcome all forces that hold the universe together. The key value is the equation of state parameter w, the ratio between the dark energy pressure and its energy density. If −1 < w < 0, the expansion of the universe tends to accelerate, but the dark energy tends to dissipate over time, and the Big Rip does not happen."

"According to the latest cosmological data available, the uncertainties are still too large to discriminate among the three cases w < −1, w = −1, and w > −1."

"In their paper, the authors consider a hypothetical example with w = −1.5, H0 = 70 km/s/Mpc, and Ωm = 0.3, in which case the Big Rip would happen approximately 22 billion years from the present. In this scenario, galaxies would first be separated from each other about 200 million years before the Big Rip. About 60 million years before the Big Rip, galaxies would begin to disintegrate as gravity becomes too weak to hold them together. Planetary systems like the Solar System would become gravitationally unbound about three months before the Big Rip, and planets would fly off into the rapidly expanding universe. In the last minutes, stars and planets would be torn apart, and the now-dispersed atoms would be destroyed about 10−19 seconds before the end. At the time the Big Rip occurs, even spacetime itself would be ripped apart and the scale factor would be infinity"

goohle 2021-08-18 11:44:14 +0000 UTC [ - ]

We have less than 14 billion years before we fall into the Great Attractor or the Shapley Attractor, or are erased in the process of falling.

https://en.wikipedia.org/wiki/Great_Attractor

https://en.wikipedia.org/wiki/Shapley_Attractor

dylan604 2021-08-18 13:51:16 +0000 UTC [ - ]

Only if we figure out how to get off this one solitary rock and expand beyond the solar system. The sun (Sol) will continue to go through its natural processes, which will see it expand to larger than Earth's orbit. That will be the end of the physical Earth, but life as we know it will have ended before then.

So, I'm not sure who the "we" will be that will have less than 14 billion years.

dcuthbertson 2021-08-18 12:38:06 +0000 UTC [ - ]

Well, I, for one, don't plan on being there when it happens!

stunt 2021-08-18 09:25:37 +0000 UTC [ - ]

Everything about early astronomical drawings is astonishing. Even when you go way back in time, before they had telescopes, they were still able to map out the visible stars pretty accurately.