
5 answers

No.

Cameras sent on space missions typically have narrow bandpass filters, and typically none of these filters exactly corresponds to the wavelengths of maximum red, green, and blue sensitivity in the human eye. So the "compromise" is to just pick three of the filter channels and render them as red, green, and blue.

The result is a very approximate representation of what it would actually look like if you were there. A number of knowledgeable commentators believe that the published results look far too red.

2007-01-06 07:24:56 · answer #1 · answered by Keith P 7 · 0 0

No. They have been enhanced and are redder than the planet really is. You have to understand that NASA didn't necessarily act in bad faith when this was done, though. The fact of the matter is that cameras typically don't capture things in true color anyway, unless you are an experienced photographer using a very good camera and are able to change its settings. That wasn't the case for Viking, and at the time they really weren't sure what things on Mars would look like. The lighting on Mars is different from that on Earth, and something that looks one color on Earth may look a different color under different lighting, such as that found on Mars. To help quell this uncertainty, the most recent missions were equipped with very good cameras with many filters and a "Sundial" color calibration wheel.

You can go here to read more about it:
http://www.marsnews.com/news/20040130-truecolor1.html
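The idea behind a calibration target is simple: the camera photographs an object whose true colors are known, and each channel is then scaled until the target comes out right. A minimal sketch of that correction (all the numbers and names here are illustrative, not actual mission data):

```python
import numpy as np

def calibrate_channels(img, target_measured, target_known):
    """Scale each color channel so the calibration target's measured
    values match its known reflectance (hypothetical example values)."""
    gains = np.asarray(target_known, dtype=float) / np.asarray(target_measured, dtype=float)
    return np.clip(img * gains, 0.0, 1.0)

# Toy example: the camera saw a known neutral-gray target as too red,
# so the same per-channel gains correct the whole frame.
frame = np.full((2, 2, 3), [0.8, 0.5, 0.4])               # raw, red-tinted image
corrected = calibrate_channels(frame, [0.8, 0.5, 0.4], [0.5, 0.5, 0.5])
print(corrected[0, 0])   # -> [0.5 0.5 0.5]
```

The same gains derived from the target are applied to every pixel, which is why a single small calibration object can correct an entire scene.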

So what color is Mars? The soil can be somewhat orangish, with varying tones. The sky, much like the color of the sky here on Earth, varies depending on how many particles of dust are in the atmosphere and what time of day it is. On a calm, crystal-clear day on Mars, expect the sky to be... deep blue!

http://quest.arc.nasa.gov/mars/ask/atmosphere/Color_of_sky_without_dust.txt

Typically, though, there's a lot of dust in the air, and the sky is a greyish white with a red haze, especially near the horizon. At sunset or sunrise on such a day, the sky would appear blue near the Sun because of how the sunlight scatters off the dust particles.

2007-01-06 16:44:14 · answer #2 · answered by minuteblue 6 · 0 0

The images that are sent back are usually black and white pictures.

On Earth, a computer takes a few images that were taken through different colour filters and (knowing which image was taken with which filter) rebuilds a color image.

Many of the filters on probes are for wavelengths (ultraviolet, infrared) that we cannot see but which can carry various types of information. When images are rebuilt from these, they are called "false-color" images. For example, images taken through infrared filters are recomposed using visible colors (otherwise we could not study them): deep infrared is shown as red, middle infrared as green, and near infrared as blue.
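The recombination step described above amounts to stacking three grayscale frames into one color image. A minimal sketch with made-up pixel values (the 670/530/440 nm names echo the Pathfinder filters quoted later in this thread; for a false-color product you would stack infrared bands instead):

```python
import numpy as np

# Three grayscale frames, each taken through a different filter.
# Values are invented; real frames would come from the probe's camera.
red_670nm   = np.array([[0.9, 0.2], [0.4, 0.7]])
green_530nm = np.array([[0.3, 0.2], [0.4, 0.5]])
blue_440nm  = np.array([[0.1, 0.2], [0.4, 0.3]])

# Assign each filtered frame to a display channel (filter -> R, G, B).
# Swapping in deep/mid/near infrared frames here would give false color.
rgb = np.dstack([red_670nm, green_530nm, blue_440nm])
print(rgb.shape)   # -> (2, 2, 3)
```

Nothing about the stacking itself knows which wavelengths were used; the choice of which filter feeds which channel is what decides whether the result is "true" or "false" color.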

On pictures of Earth taken that way, healthy deciduous trees (leaves instead of needles) show up red; sick ones show up a different color.

Pictures can be recomposed with true colors. However, these are often not useful for study. Even images composed only from images taken through "normal" color filters will have something enhanced (contrast, color differences...) so that features can be seen better.

PS:
I found this web site (1) of Pathfinder images, where they state:
"Mosaics of images obtained by the right camera through 670 nm, 530 nm, and 440 nm filters were used as red, green and blue channels."
Thus, the color in the images was rebuilt from three B&W images taken through specific filters.

In another image on the same Web site:
"Red and blue filter images have been combined to enhance brightness contrasts among several soil units." They skipped the green image in order to give more contrast between rocks (bluer) and soil (redder).

Another PS:
At Web site (2), there are pictures rebuilt from 7 filters. There are even some 3-D pictures where they take the picture from the left camera and make it red, while the picture from the right camera is shown in blue. Then, with the appropriate blue/red filters in front of your eyes, the left eye sees only the left image while your right eye sees only the right image: your brain recombines the two images to give you depth perception (a 3-D image).
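That anaglyph trick is easy to sketch in code: the left camera's grayscale frame feeds the red channel, and the right camera's frame feeds the blue (and green) channel, so each eye's colored filter passes only its own view. A minimal sketch with invented one-row images:

```python
import numpy as np

def make_anaglyph(left_gray, right_gray):
    """Build a red/blue anaglyph: left image -> red channel,
    right image -> blue (and green) channel."""
    h, w = left_gray.shape
    out = np.zeros((h, w, 3))
    out[..., 0] = left_gray     # red channel   <- left camera
    out[..., 1] = right_gray    # green channel <- right camera
    out[..., 2] = right_gray    # blue channel  <- right camera
    return out

left  = np.array([[1.0, 0.0]])   # toy left-camera frame
right = np.array([[0.0, 1.0]])   # toy right-camera frame
ana = make_anaglyph(left, right)
print(ana[0, 0])   # -> [1. 0. 0.]  (pure red: visible only to the left eye)
```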

2007-01-06 06:46:59 · answer #3 · answered by Raymond 7 · 1 0

They're intended to be, but it's a lot more difficult than it seems. The attached link contains a lot of interesting information.

2007-01-06 10:45:22 · answer #4 · answered by Iridflare 7 · 0 0

They are enhanced to show more detail. It's like turning up the contrast on your monitor. So no, they are not truly accurate colors.
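The "turning up the contrast" enhancement this answer mentions is often a simple linear stretch: pixel values that span only part of the brightness range are rescaled to fill all of it. A minimal sketch with made-up values:

```python
import numpy as np

def stretch_contrast(img):
    """Linearly stretch pixel values to span the full 0..1 range,
    like turning up the contrast on a monitor."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

flat = np.array([0.4, 0.5, 0.6])   # low-contrast data
print(stretch_contrast(flat))      # -> [0.  0.5 1. ]
```

This makes faint features easier to see, but it also means the displayed brightness and color differences no longer match what an observer on the surface would perceive.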

2007-01-06 06:17:07 · answer #5 · answered by David 5 · 1 0
