
How the James Webb Space Telescope creates images of ‘invisible’ interstellar objects

The James Webb Space Telescope recently stunned the world with its first images of space, including a deep field image that showed the infrared universe in more depth than ever before.

But you can’t just point a telescope at a patch of space and snap a photo. The data collected by Webb has to be translated from infrared into visible light and processed into an image before it can be shared with the public.


Processing this data into beautiful images is the job of Joe DePasquale of the Space Telescope Science Institute, who was responsible for processing some of the first James Webb images including the iconic deep field. He told us what it takes to make this incredible data come to life.

A rotating wheel of filters

To gather data on the many different types of targets that James Webb will observe, from black holes to exoplanets, its instruments need to be able to take readings at different wavelengths within the infrared. To do that, its instruments are armed with filter wheels: carousels of different materials, each of which allows different wavelengths of light to pass through.

Scientists select which instruments and which wavelengths they want to use for their observations, and the filter wheels rotate to put the corresponding element in front of the instrument’s sensors. While introducing moving parts into such a complex piece of technology is always a risk, engineers are well-practiced at working with this kind of hardware by now, as similar filter wheels are used in other space-based telescopes like the Hubble Space Telescope and the Chandra X-ray Observatory.
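As a rough mental model, a filter wheel behaves like a carousel that parks one named element in the light path at a time. Here is a minimal Python sketch of that idea; the FilterWheel class and its stepping logic are illustrative assumptions rather than Webb’s flight software, though the filter names are real NIRCam filters that appear later in this article.

```python
# A minimal sketch (not flight software) of the filter-wheel concept:
# a carousel of named filters that rotates one element into the beam.
class FilterWheel:
    def __init__(self, filters):
        self.filters = list(filters)  # carousel positions, in order
        self.position = 0             # index of the filter currently in the beam

    def select(self, name):
        """Rotate the wheel until the named filter sits in the light path."""
        target = self.filters.index(name)
        steps = (target - self.position) % len(self.filters)
        self.position = target
        return steps  # how many carousel positions the mechanism moved

wheel = FilterWheel(["F090W", "F150W", "F200W", "F277W", "F356W", "F444W"])
print(wheel.select("F277W"))  # -> 3 positions from the starting filter
```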

MIRI Filter Wheel (Qualification Model) for the James Webb Space Telescope

“It’s incredible that these spacecraft have these moving parts in them that continue to function for years and are flight-ready and radiation-hardened,” DePasquale said.

When Webb observes a target, it looks first through one filter, then another, and then more if required. For Webb’s first deep field image, it took data using six filters, each of which produces a black-and-white image. Each filter was used for a two-hour exposure, adding up to a total of 12 hours of observation time.

Once the data has been collected, it’s sent to instrument teams for preprocessing; then, it’s delivered to DePasquale. “You get six individual images, each one corresponding to the filter that it was taken with,” he said. His task is to turn those six black-and-white images into one of the stunning images of space we love to admire.

Combining black and white to make color

DePasquale will receive a varying number of images depending on how many filters the researchers have chosen, then he will combine them into a single image. By mapping data from these filters onto color channels, he creates a color image. For this work, he’ll use a combination of general-purpose graphics editing software like Adobe Photoshop and specialty astronomical software like PixInsight, which was originally developed for amateur astrophotography.

The filters can be mapped onto channels in all sorts of ways, but typically, DePasquale says he’ll map onto the red, green, and blue channels, or RGB, which are commonly used for digital images.
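To make that mapping concrete, here is a minimal Python sketch of the channel assignment using NumPy. The to_rgb function and the random arrays standing in for preprocessed exposures are illustrative assumptions; real pipelines read calibrated FITS files, often via a library like astropy.

```python
# A minimal sketch of chromatic ordering: map grayscale filter images
# onto RGB channels, shortest wavelength to blue and longest to red.
import numpy as np

def to_rgb(short_wave, mid_wave, long_wave):
    """Stack three grayscale frames into an RGB image."""
    return np.stack([long_wave, mid_wave, short_wave], axis=-1)

# Three stand-in 4x4 grayscale frames with values normalized to 0..1.
frames = [np.random.rand(4, 4) for _ in range(3)]
rgb = to_rgb(*frames)
print(rgb.shape)  # (4, 4, 3): an RGB image ready for further adjustment
```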

Black-and-white images combined to make a color image. NASA, ESA, CSA, STScI; screenshot: Joe DePasquale

“Combining things in RGB usually creates the most natural-looking image, as that’s due to the nature of our eyes and how they perceive light,” he said. “We have the cone cells in our eyes that are responsive to red, green, and blue light. So our eyes are already primed to interpret the world that way.”

In the deep field image, he took the six filters — F090W, F150W, F200W, F277W, F356W, and F444W, which are named for the wavelength at which they observe — and combined the two shortest wavelength filters into blue, the two medium wavelength filters into green, and the two longest wavelength filters into red. These are then combined using the screen blending mode in Adobe Photoshop, which builds up the brightness of the layers to make a color image.
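The screen mode itself is simple arithmetic on normalized pixel values: two layers a and b combine as 1 - (1 - a)(1 - b), which brightens the result without the hard clipping that plain addition can cause. Here is a minimal NumPy sketch of that pairing, with random arrays standing in for the actual exposures; only the filter names come from the article.

```python
# A minimal sketch of Photoshop-style "screen" blending in NumPy.
import numpy as np

def screen(a, b):
    """Combine two normalized (0..1) layers: 1 - (1-a)*(1-b)."""
    return 1.0 - (1.0 - a) * (1.0 - b)

# Stand-in exposures for the six deep-field filters (real data would
# be the calibrated images delivered after preprocessing).
f090, f150, f200, f277, f356, f444 = (np.random.rand(4, 4) for _ in range(6))

blue  = screen(f090, f150)  # two shortest wavelengths
green = screen(f200, f277)  # two middle wavelengths
red   = screen(f356, f444)  # two longest wavelengths
rgb = np.stack([red, green, blue], axis=-1)  # final color composite
```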

In other images, like the Webb image of the Carina nebula, which was processed by DePasquale’s colleague Alyssa Pagan, each of the six different filters was assigned its own color to pick out all of the different features of the nebula. But that didn’t work so well for the deep field.

“I did try giving each filter its own unique color,” DePasquale said. “That can create a nice image but in the case of the deep field it really wasn’t working well. It was creating some strange color artifacts and galaxies weren’t appearing as they should. So I went with this approach, and it made a more natural-looking color image to me.”

A better-looking image

This is why image processing work requires an artistic touch as well as scientific understanding. The job of a processor is to create an image that both accurately represents the data and is visually appealing.

Once data from different filters has been combined, DePasquale works on adjusting the image’s color levels to make something attractive, but in a way that is grounded in astronomical principles. For the Webb deep field image, he adjusted the colors using a particular spiral galaxy as the white reference point and a blank patch of sky as the gray background.

“When we have a deep field image or an image with a lot of galaxies in the background, my approach generally is to use face-on spiral galaxies as the white reference point for the entire image,” he explained.

“That’s because face-on spiral galaxies will display an entire population of stars, from the youngest stars to the oldest stars, representing all the colors that are possible within stars,” he said. “So we go from the bright blues of young stars to the older, yellowish stars and everything in between. So if you use that as your white reference point that gives you a really nicely balanced image overall.”
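In code terms, this kind of reference-based balancing amounts to scaling each channel so the chosen patch averages out to neutral. A minimal NumPy sketch, where the patch coordinates are a hypothetical stand-in for a face-on spiral galaxy:

```python
# A minimal sketch of white balancing against a reference region:
# scale each channel so the reference patch averages to neutral gray.
import numpy as np

def white_balance(rgb, ref_region):
    ref_means = rgb[ref_region].reshape(-1, 3).mean(axis=0)  # per-channel mean in the patch
    gains = ref_means.mean() / ref_means                     # push channels toward equality
    return np.clip(rgb * gains, 0.0, 1.0)

rgb = np.random.rand(100, 100, 3)              # stand-in color image
galaxy_patch = (slice(40, 60), slice(40, 60))  # hypothetical reference region
balanced = white_balance(rgb, galaxy_patch)
```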

The look of a deep field

So far, we have only two observatories capable of creating deep field images: Hubble and Webb. Hubble operates in the visible light range, while Webb operates in the infrared, but both are taking views of distant galaxies in dim parts of the sky. It’s interesting to compare the look of deep fields from each and see how they differ.

Images from Webb will have their own unique look compared to images from other telescopes such as Hubble. This is most noticeable in the way that bright stars appear, with their distinctive eight-pointed diffraction spikes. These are caused by the hexagonal shape of Webb’s mirror segments and the struts supporting its secondary mirror, and they are inherent to images captured with the telescope.

Chris Gunn / NASA

But overall, DePasquale says he aims for a general consistency between images collected by Webb and those collected by Hubble. After all, regardless of how the data is collected, the objects being imaged are similar.

When it comes to deep field images, “that is something I’ve been working with for many years,” DePasquale said. “So I kind of have an intuitive sense of what it should look like. And I know that a face-on spiral galaxy should have a certain look to it, the distant smudges should have a certain hue to them, and everything in between should look natural.”

A philosophy of the infrared

One big difference between Webb and Hubble is that Webb is capable of looking at even more distant galaxies than Hubble, and many of these galaxies are so far away that their light takes billions of years to reach us. Because the universe expands while that light is in transit, its wavelength is stretched out of the visible range and into the infrared, in a process called redshift.
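The stretch factor is (1 + z), where z is the redshift: light emitted at one wavelength arrives at that wavelength times (1 + z). A small worked example in Python; the z = 3 galaxy here is purely hypothetical.

```python
# A minimal worked example of cosmological redshift: the observed
# wavelength is the emitted wavelength stretched by a factor (1 + z).
def observed_wavelength(emitted_nm, z):
    return emitted_nm * (1.0 + z)

# Hydrogen-alpha light (656 nm, visible red) from a galaxy at z = 3
# arrives at 2,624 nm -- deep in the infrared, invisible to our eyes.
print(observed_wavelength(656.0, 3.0))  # 2624.0
```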

This brings up a conundrum: How should image processors display a galaxy that is invisible to our eyes because of redshift, but that would be giving off visible light if it were in front of us? The Webb deep field is full of such redshifted galaxies, and even the comparatively nearby main galaxy cluster in the image is redshifted as well.

“Some people will have a philosophical argument about the colors in this image, because the galaxy cluster is already four and a half billion light-years away. So it technically should be redshifted. This should be a lot more red than it looks,” DePasquale said.

The Phantom Galaxy captured by the James Webb Space Telescope.
ESA/Webb, NASA & CSA, J. Lee and the PHANGS-JWST Team

But he instead chooses to present the data in a way that mitigates the redshift and uses a wider range of colors to give more information.

“Instead of making the whole image have a red cast over it, let’s make the spiral galaxy we see in this image the white reference point, so that the cluster now becomes white instead of yellow,” he said. “And then, you get color information from everything else behind it. So the really, really distant galaxies now show up as red points in this image, and other stuff that’s closer is less red.”

The story of Webb

This approach not only helps viewers see the diversity of galaxies in the deep field but also highlights the particular abilities of Webb.

“The story with Webb is that it can see the distant, distant galaxies, whereas Hubble gets to a point it can no longer see them because they have redshifted into infrared light,” he said.

This ability to look for these high redshift galaxies is what will enable Webb to see some of the earliest galaxies which formed in the very young universe. It’s not that Webb is simply more powerful than Hubble, but rather, that they are looking at different parts of the electromagnetic spectrum.

This is complicated by the fact that Webb’s resolution changes based on the wavelength that it looks at. At longer wavelengths, its images have lower resolution. But this relationship between wavelength and resolution isn’t necessarily a bad thing for working with deep-field images.
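The underlying reason is diffraction: a telescope’s smallest resolvable angle scales roughly as 1.22λ/D, where λ is the wavelength and D is the mirror diameter, so doubling the wavelength halves the resolution. A minimal Python sketch using Webb’s roughly 6.5-meter primary mirror and example wavelengths:

```python
# A minimal sketch of the diffraction limit: theta ~ 1.22 * lambda / D.
import math

def diffraction_limit_arcsec(wavelength_um, mirror_diameter_m=6.5):
    theta_rad = 1.22 * (wavelength_um * 1e-6) / mirror_diameter_m
    return math.degrees(theta_rad) * 3600  # radians -> arcseconds

for wl_um in (0.9, 2.0, 4.44):  # short, medium, and long example wavelengths
    print(f"{wl_um} um -> {diffraction_limit_arcsec(wl_um):.3f} arcsec")
```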

This first image from NASA’s James Webb Space Telescope is the deepest and sharpest infrared image of the distant universe to date. Known as Webb’s First Deep Field, this image of galaxy cluster SMACS 0723 is overflowing with detail. Thousands of galaxies – including the faintest objects ever observed in the infrared – have appeared in Webb’s view for the first time. This slice of the vast universe covers a patch of sky approximately the size of a grain of sand held at arm’s length by someone on the ground. NASA, ESA, CSA, and STScI

“It works well for the deep field image because at the longest wavelengths the galaxies that you’re detecting are really the faint ones, or the really dusty ones, and they may not have a lot of structure to begin with,” DePasquale said. “So if they’re a little less resolved, it actually looks very natural in the image.”

Scientific knowledge and creative freedom

The work of image processors like DePasquale is often the first way that members of the public engage with space science, so it’s important that it be both accurate and appealing. That requires a degree of trust between the scientists performing the research and the processors who present that work to the public.

But in his experience, he says, most scientists are delighted to see their work translated into color images. “At this point in my career, I’ve gotten to the point where I’m given creative freedom to create a beautiful image, but people trust that I know the science well enough to be able to create a beautiful color image that also tells a scientific story,” said DePasquale.

The reaction to the first James Webb images was a case in point. Not only were space experts excited to see the potential of this new telescope, but members of the public from around the world were also amazed by these fascinating new views of the cosmos.

This is just the beginning of what we’ll see from Webb, with plenty more images from the telescope to be shared over the coming months.

DePasquale says the public reaction to the first images is everything he’d hoped for. “It’s been amazing to see. They are literally everywhere. They were displayed in Times Square, of all places. It’s been incredible.”
