Digit Geek

Why NASA manipulates its images

The breathtaking photos of space released by NASA are often processed, but there are valid reasons why this is done

NASA releases a steady stream of awe-inspiring images from its various astronomical instruments. However, most of these photos are manipulated, and are very different from the raw images captured by the telescopes. This article focuses on four NASA instruments: the Hubble Space Telescope, the Curiosity rover, the Cassini mission to Saturn, and the Juno mission to Jupiter. The underlying principles, though, of how the images are captured and why they are manipulated in a particular way, apply to most of the images released by NASA.

Almost every image released by NASA involves multiple exposures. The images are captured in black and white to get the most detail and information in particular wavelengths. These monochrome frames are then layered to create a composite colour image. If an object has moved between the exposures, it must be realigned in the frame. At other times, multiple shots are stitched into a mosaic, and sometimes these mosaics are digitally mapped onto a sphere to create an image of a planet or moon. Flyovers of terrain involve creating depth maps from the mosaics of planetary surfaces. After all this comes colour correction and touching up portions of the image, and finally we get the images that we see in the newspapers and on the internet.
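The realign-then-layer step can be sketched in a few lines of Python. This is a hypothetical toy, not NASA's pipeline: the drift between exposures is assumed to be known in advance, whereas real pipelines estimate it, for example by cross-correlating the frames.

```python
import numpy as np

# Two black-and-white exposures of the same object; between them the
# object has drifted by (1 row, 2 columns). Tiny 5x5 arrays stand in
# for full-resolution telescope frames.
frame_a = np.zeros((5, 5))
frame_a[2, 2] = 1.0            # object centred in the first exposure

frame_b = np.zeros((5, 5))
frame_b[3, 4] = 1.0            # same object after drifting

# Shift the second frame back into register, then layer the exposures.
aligned_b = np.roll(frame_b, shift=(-1, -2), axis=(0, 1))
composite = (frame_a + aligned_b) / 2   # simple average stack

print(composite[2, 2])  # 1.0 -> the exposures now reinforce each other
```

Without the alignment step, averaging the two frames would smear the object across two positions instead of sharpening it.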

The astronomical instruments can view the objects they observe in ultraviolet, infrared and visible light. Humans cannot see beyond the visible portion of the spectrum, so false-colour images are adjusted accordingly, with the ultraviolet and infrared features pushed into the visible range.

The above image is actually a composite of 25 different images, taken over a period of thirteen hours by the Cassini spacecraft. The human eye can see wavelengths between 0.4 and 0.7 microns. In this image, the wavelengths of 2.3, 3.0 and 5.1 microns are represented by visible blue, green and red respectively. All the data used in the image is from the infrared portion of the spectrum. The green portion is the day side, while the red portion is the night side, which is lit up by thermal radiation produced deep within the atmosphere of the gas giant. At 2.3 microns the planet appears dark while the rings are bright; at 3.0 microns, the rings are dark and the planet appears bright. This is because of the way the ice particles in the rings and the methane in the Saturnian atmosphere absorb or reflect light at various wavelengths. In the composite, the data from 2.3 microns is used for the rings, while the data from 3.0 microns is used for the day side of the planet.
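The false-colour mapping described above can be sketched as a simple channel assignment. The pixel values below are invented for illustration; only the wavelength-to-channel mapping (5.1 microns to red, 3.0 to green, 2.3 to blue) follows the text.

```python
import numpy as np

# One monochrome frame per infrared wavelength, faked as tiny 2x2
# arrays of constant brightness for illustration.
frames_by_micron = {
    5.1: np.full((2, 2), 0.8),  # thermal glow of the night side
    3.0: np.full((2, 2), 0.5),  # day side of the planet
    2.3: np.full((2, 2), 0.3),  # wavelength at which the rings are bright
}

# Push the invisible wavelengths into visible channels:
# 5.1 um -> red, 3.0 um -> green, 2.3 um -> blue.
false_colour = np.stack(
    [frames_by_micron[5.1], frames_by_micron[3.0], frames_by_micron[2.3]],
    axis=-1,
)
print(false_colour[0, 0])  # [0.8 0.5 0.3]
```

The choice of which wavelength goes to which channel is an editorial one; it is usually made so that physically distinct features (here, the day side, night side and rings) end up in visually distinct colours.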

The Imaging Science Subsystem on board the Cassini spacecraft had two cameras, a narrow angle camera and a wide angle camera. Each camera had a set of rotating filters, so that the object under observation could be photographed in various wavelengths. There were eighteen such filters on the cameras. This is incredibly useful for science. NASA has made all the raw images taken from the imaging instruments on board the Cassini spacecraft available to the general public. By and large, when NASA releases the images, it tries to replicate what the human eye would have seen had it been at the location instead of the spacecraft.

The Hubble Space Telescope has seen a series of upgrades to its imaging instruments. Currently, it uses the Wide Field Camera 3. The telescope can image objects under observation in the ultraviolet, visible and near-infrared wavelengths. The raw images captured by the Hubble Space Telescope are also in black and white. These are layered in three channels to produce the final images.

This image of the Crab Nebula is actually composed of images of the same object captured by five different instruments. The Very Large Array provided the radio data, at wavelengths far longer than visible light. The radio data is represented by the colour red. The infrared data was captured by the Spitzer Space Telescope, and is represented by yellow. The visible light has been reduced to green, and this image was captured by the Hubble Space Telescope. The XMM-Newton provided the ultraviolet data, represented by blue. Finally, the Chandra X-ray Observatory captured the X-ray data, represented by purple. When all of these come together, you have a truly spellbinding image capturing an immense amount of detail. All the stars appear blue, because most of the radiation they emit is in the ultraviolet wavelengths. This is not how the Crab Nebula, or the background stars, would appear to the human eye.

The JunoCam imaging instrument on board the Juno spacecraft is responsible for providing us with some of the best images of Jupiter. However, this is not strictly a scientific instrument. Although it has been used for scientific investigations, the primary purpose of the camera is to allow NASA to engage with the public and get the image processing community involved. Towards this end, NASA rarely processes the images itself, and has instead invited the general public to play around with the raw images and upload their final results on the JunoCam website. The raw images are made up of strips of black and white images, captured with red, blue, green and near-infrared filters. The strips from the filters have to be lined up to create the final image.
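The strip-reassembly step can be sketched roughly as follows. The strip counts, sizes and the three-filter loop below are invented for illustration; real JunoCam processing also has to correct for the spacecraft's spin and optical distortion, which this toy ignores.

```python
import numpy as np

# Each filter pass yields several narrow horizontal strips; stacking
# the strips rebuilds that filter's full frame, and layering the
# filters' frames gives a colour image.
n_strips, strip_h, width = 4, 2, 6
rng = np.random.default_rng(1)

def rebuild_frame(strips):
    # Concatenate the strips top-to-bottom into one frame.
    return np.concatenate(strips, axis=0)

channels = []
for _ in range(3):  # e.g. red, green and blue filter passes
    strips = [rng.random((strip_h, width)) for _ in range(n_strips)]
    channels.append(rebuild_frame(strips))

colour = np.stack(channels, axis=-1)
print(colour.shape)  # (8, 6, 3) -> height, width, RGB
```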

The landscapes, rock formations, and even the selfies from Mars are white balanced. The image data is scaled relative to a specific pixel in the image that is assumed to be white. This is done so that viewers can make sense of the formations on an alien terrain, as if they were seeing the Red Planet under lighting conditions similar to daytime on Earth. If humans were to observe Mars for themselves, they would see a more orangish-reddish hue, with the details appearing blurry and the differences between differently coloured rocks less distinct. Even these images are processed and incorporate additional data from special instruments that supplement what the camera captures. In the raw data, the colours are actually inaccurate, and the details are even blurrier.
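White balancing against a reference pixel can be sketched in a few lines. This is a minimal toy, assuming the simplest possible scheme (dividing every pixel by the reference pixel's channel values); the pixel values below are invented.

```python
import numpy as np

# A tiny 2x2 RGB image with a reddish cast, values in 0..1.
image = np.array([
    [[0.9, 0.6, 0.4], [0.5, 0.3, 0.2]],
    [[0.8, 0.5, 0.3], [0.7, 0.45, 0.3]],
])

# Pick the pixel assumed to be white and scale each channel so that
# this pixel becomes equal in all three channels, removing the cast.
white_ref = image[0, 0]
balanced = np.clip(image / white_ref, 0.0, 1.0)

print(balanced[0, 0])  # [1. 1. 1.] -> the reference pixel is now white
```

Scaling by the reference shifts every other pixel's colours in the same proportion, which is why rocks that looked uniformly reddish become easier to tell apart.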

On the left, how Mars actually looks. On the right, how the same terrain would look in an Earth-like atmosphere and lighting conditions.

Almost every image that NASA releases to the public has a dedicated web page with a caption, the details of the post-processing done, and an explanation of why the image has been processed in a particular way. However, this information does not accompany the images through all the media where they are distributed, from newspapers to the internet. In the end, readers and viewers have little idea of what went on behind the scenes to create these stunning images.

Aditya Madanapalle

An avid reader of the magazine, who ended up working at Digit after studying journalism, game design and ancient runes. When not egging on arguments in the Digit forum, can be found playing with LEGO sets meant for 9 to 14-year-olds.