The Secret Behind All of NASA's Gorgeous Space Photos? Photoshop, of Course
From the glossy, high-res photos, you'd think outer space is a diamond-studded smear of billowing clouds, mixing together like slick oil and rain in a swirling gutter with glass shards and flecks of concrete.
This is not what outer space actually looks like — at least not to the naked eye. Many famous photos of star systems, nebulas and other astral bodies are far from what you would see if you were simply standing on the surface of a distant moon, looking up at Andromeda or the Pillars of Creation.
So how does NASA get such stunning visuals? The images captured by telescopes like Hubble, or even by consumer DSLR cameras, are shot over and over at obscenely high resolution and then stacked and manipulated. Amateur photo editors can use tools like DeepSkyStacker and Nebulosity — there's an entire suite of usable programs — but usually, the stars are heavily Photoshopped.
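The stacking step is simple enough to sketch in a few lines of Python. This isn't how DeepSkyStacker or Nebulosity work internally; it's a minimal illustration that assumes a handful of already-aligned grayscale exposures (the file names below are made up) and averages them to beat down sensor noise.

```python
# Minimal sketch of exposure stacking: average several aligned grayscale
# frames so random sensor noise cancels out while the faint signal remains.
# The file names are hypothetical placeholders for pre-aligned exposures.
import numpy as np
from PIL import Image

frame_paths = ["orion_001.tif", "orion_002.tif", "orion_003.tif"]

# Load each frame as a 32-bit float grayscale array.
frames = [np.asarray(Image.open(p).convert("F")) for p in frame_paths]

# Mean-stacking: the average of N noisy frames is cleaner than any single one.
stacked = np.mean(frames, axis=0)

# Rescale to 0-255 and save the result for the later color work.
spread = stacked.max() - stacked.min()
scaled = 255 * (stacked - stacked.min()) / (spread + 1e-9)
Image.fromarray(scaled.astype(np.uint8)).save("orion_stacked.png")
```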
Take, for example, the Orion Nebula.
The GIF above shows the photo-editing process: First, exposures are collected from across the light spectrum, visible and otherwise. Then the parts of the spectrum that would otherwise go unseen are assigned colors — red, green, cyan and blue — and given layer and curve adjustments before everything is blended back together into a single image.
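The channel-mapping step can also be sketched in code. The snippet below is a rough, hand-rolled approximation rather than the actual Photoshop workflow: it assumes three stacked grayscale exposures from different wavelength bands (the file names are made up), assigns each band to a color channel, and uses a simple gamma stretch to stand in for the curve adjustments.

```python
# Sketch of a false-color composite: map three stacked grayscale bands onto
# the red, green and blue channels of a single image. The file names and the
# gamma value are illustrative assumptions, not NASA's actual pipeline.
import numpy as np
from PIL import Image

def load_band(path):
    """Load one grayscale wavelength band and normalize it to the 0-1 range."""
    band = np.asarray(Image.open(path).convert("F"))
    return (band - band.min()) / (band.max() - band.min() + 1e-9)

def curve(band, gamma=0.5):
    """Crude stand-in for a Photoshop curve adjustment: brighten faint detail."""
    return band ** gamma

# A common convention: longest wavelength -> red, shortest -> blue.
red = load_band("orion_band_long.tif")
green = load_band("orion_band_mid.tif")
blue = load_band("orion_band_short.tif")

# Combine the adjusted bands into one RGB image and save it.
rgb = np.dstack([curve(red), curve(green), curve(blue)])
Image.fromarray((rgb * 255).astype(np.uint8)).save("orion_false_color.png")
```

Swap the band-to-channel assignments or the gamma value and the same raw data yields a visibly different composite, which is exactly the kind of editorial choice described below.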
"I basically take raw grayscale data from different parts of the infrared spectrum, and then remap them into visible colors — typically with red, green and blue Photoshop layers — to create images that are accurately representative of the infrared colors that human eyes cannot see," CalTech's Robert Hurt told Adobe. "I think of it as a visual translation process."
Here's the original compared to the final composite (on right):
When we see photos like the Ultra Deep Field or the Milky Way, we're looking at shapes and structures that don't fall within the visible spectrum. The raw data captured when the photo is taken is rich with unseen detail, and editors can pull in the ultraviolet and infrared light and assign it colors like red or green to render it visible.
Dozens of these layers are added, blended and balanced, one after the other, until the editor decides the photo is done. It's an arduous process that takes hours of cropping, rotating, adjusting and layering. For a simple photo of spiral galaxy NGC 3982, this process took 10 hours of editing over a three-week period:
But because there are different ways to pull those slices of invisible data into a photo, each astrophotographer and editor can treat the data differently, generating different images of the same heavenly body from the same raw files.
"Because there is a lot of creativity, with the same set of raw data, two different people are going to come up with different things," astral photographer Rogelio Bernal Andreo told Wired about his process.
These kinds of astral photos are highly interpretive — interpretive of very real data, yes, but much like your Instagram account, they're still creative depictions that use artifacts of reality to create a selective vision of beauty that might not otherwise be plain to the human eye.