Yes, NASA fiddled with the first color images from the Webb Telescope last week

You’ve seen the first color images from the James Webb Space Telescope, haven’t you? A stellar nursery revealing previously unseen stars, the atmosphere of a giant exoplanet examined, a group of galaxies, a beautiful planetary nebula and the deepest image of our universe ever captured.

Pretty cool, huh? But were they real?

Of course, they were real!

Were they exactly as Webb captured them in a single image, as if you were taking a picture with your phone?

Not at all.

Webb is designed to be sensitive to light that we cannot see. It also has four scientific instruments and seventeen modes.

“When you get the data, it looks nothing like a nice color image,” said Klaus Pontoppidan, Webb project scientist at STScI, who leads a team of 30 image-processing experts. “They look almost like nothing at all [and] only if you know what to look for can you appreciate them.”

The images we saw had been heavily processed by Webb’s engineers before they were released, and for fairly simple, common-sense reasons.

So what’s going on?

It’s not just about taking a photo on a phone.

Image planning

First comes target selection. NASA was looking for objects that would make a nice frame, show structure and color, and also showcase the science.

Webb can’t see every part of the sky at one time. So, since the telescope’s launch was repeatedly delayed, there was no way for engineers to meticulously plan the first images until Webb took to the skies last December.

When it did, the engineers had a list of about 70 targets, selected to demonstrate the breadth of science Webb was capable of and to promise spectacular color images.

“Once we knew when we would be able to take the data, we could go through that list and choose the highest priority targets that were visible at that time,” Pontoppidan said. “The images were planned for a long time [and] there was a lot of work to simulate what observations would look like so that everything could be set up correctly.”

How Webb’s Data Returns to Earth

Before engineers can get to work on Webb’s images, the raw data must travel back to our planet from nearly a million miles away. This is done using NASA JPL’s Deep Space Network (DSN), which lets engineers communicate with, and receive data from, more than 30 robotic probes in the solar system and beyond, including Webb. The DSN has three complexes, spaced roughly 120° apart in longitude: Goldstone in California, Madrid in Spain and Canberra in Australia.

Radio waves are very reliable, but slow: data arrives at just a few megabits per second (Mbps). However, the DSN will eventually move from slow radio transmissions to ultra-fast “space lasers”, optical links that could increase data rates 10 or even 100 times.
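To get a feel for what those rates mean in practice, here is a back-of-the-envelope calculation. The data volume and link speeds below are illustrative assumptions, not Webb’s actual figures:

```python
# Rough downlink-time estimate. The 25 GB batch size and the 3 / 300 Mbps
# rates are illustrative assumptions, not real Webb or DSN numbers.

def transfer_time_hours(data_gigabytes: float, rate_mbps: float) -> float:
    """Hours needed to move `data_gigabytes` at `rate_mbps` (megabits/s)."""
    bits = data_gigabytes * 8e9          # gigabytes -> bits
    seconds = bits / (rate_mbps * 1e6)   # bits / (bits per second)
    return seconds / 3600

# A hypothetical 25 GB batch of raw images:
radio = transfer_time_hours(25, 3)     # a few Mbps over radio
laser = transfer_time_hours(25, 300)   # a ~100x faster optical link
print(f"radio: {radio:.1f} h, laser: {laser:.2f} h")
```

At a few Mbps, even a modest batch of raw frames ties up a ground station for the better part of a day, which is why downlinks are scheduled as carefully as the observations themselves.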

“We schedule things, upload it to the observatory, take the data and bring it back to Earth, and then we have another long period of time where we process the data,” Pontoppidan said.

Why the Colors in Webb’s Photos Are Wrong

Are Webb Telescope images colorized? Yes. Are the colors “real”? Not in the sense that your eyes could ever see them. The Webb telescope sees in infrared. It’s up there specifically to detect infrared light, the faintest and most distant light in the cosmos.

It sees primarily thermal radiation, not visible light. It sees another part of the electromagnetic spectrum:

Think of a rainbow. At one end is red; at the other end is blue or purple. The real spectrum is much wider, but those two extremes mark the limits of the colors the human eye can perceive. Beyond blue there are shorter and shorter wavelengths of light for which we have no color names. Ditto beyond red, where the wavelengths lengthen.

This is where Webb looks – the infrared part of the electromagnetic spectrum.

It uses filters to select narrow slices of that infrared light, and masking techniques to detect weak light sources next to very bright ones. But none of this is in “color”.

So how can the photos we see be in color for us?

How Webb’s photos are colorized

Webb’s images are shifted up the electromagnetic spectrum from a part we cannot perceive into the part of visible light we can see.

Engineers take Webb’s monochrome images, shot through up to 29 different narrowband filters, each detecting a different wavelength of infrared light. They then assign the light collected by each filter a different visible color, from red for the longest wavelengths to blue for the shortest. Finally, they combine them into a composite image.
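The ordering step above can be sketched in a few lines. This is a toy illustration of chromatic ordering, not the actual STScI pipeline; the filter wavelengths and pixel values are made up:

```python
import numpy as np

# Toy "chromatic ordering": each narrowband filter yields a monochrome image.
# Longer wavelengths get redder visible hues, shorter ones get bluer hues,
# and the colored layers are stacked into one RGB composite.
# Filter wavelengths and image data below are invented for illustration.

def composite(filters: dict[float, np.ndarray]) -> np.ndarray:
    """filters maps wavelength (microns) -> 2-D monochrome image in [0, 1]."""
    waves = sorted(filters)                 # shortest -> longest wavelength
    h, w = filters[waves[0]].shape
    rgb = np.zeros((h, w, 3))
    for i, wav in enumerate(waves):
        t = i / max(len(waves) - 1, 1)      # 0 = bluest hue, 1 = reddest hue
        hue = np.array([t, 1 - abs(2 * t - 1), 1 - t])  # blue->green->red ramp
        rgb += filters[wav][..., None] * hue
    return np.clip(rgb, 0, 1)

# Three fake 2x2 "exposures" through hypothetical filters at 1.5, 2.7, 4.4 microns:
imgs = {w: np.full((2, 2), v) for w, v in [(1.5, 0.2), (2.7, 0.5), (4.4, 0.9)]}
print(composite(imgs).shape)  # one RGB image: (2, 2, 3)
```

The real pipeline involves calibration, noise removal and careful stretching of brightness levels, but the core idea is exactly this wavelength-to-hue mapping.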

Is it cheating? Hardly. All the engineers do is take radiation from a part of the spectrum our eyes cannot see and shift it into a part we can.

It’s like playing a song in a different key.

Also, all cameras, including your smartphone’s, use filters to take the images you see. No, not Instagram filters, but individual red, green and blue filters which, when combined, produce a viewable image that looks “real”.

If you think Webb’s pictures aren’t real, you must also think your own smartphone photos are fake.

How long does it take to process Webb images?

For Webb’s data, this complex process had never been done before. It therefore takes a few weeks for each image to emerge in all its colorful glory.

“Typically, the process from the telescope’s raw data to the final, clean image that communicates scientific information about the universe can take anywhere from a few weeks to a month,” said Alyssa Pagan, science visuals developer at STScI.

Surely it was worth the wait.

“In the first images, we only have a few days of observations,” Pontoppidan said. “It’s really only the beginning and we’re only scratching the surface.”

I wish you clear skies and big eyes.
