Ok, here we go.
Firstly, I made a rookie error earlier in assuming a simple combination of the RGB channels would be accurate. I overlooked one important fact.
Channels Normalized
All three channels have been normalized. More than that, all three channels have been amplified to their absolute peak. I don't know the graphics term for it; the audio term is hard limiting, so I'll use that.
Basically, in each of the three filter pics, the exposure has been set so that the brightest part of the picture from each filter corresponds to the absolute maximum brightness for that channel. For example, the brightest part of the red channel is FF0000, green is 00FF00, and blue is 0000FF. (Obviously they all come in as b/w pics, so in each black and white plate there is a perfect range from 000000 (absolute black) to FFFFFF (absolute white).)
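(For anyone who wants to see exactly what I mean by this outside of Photoshop, the operation I'm describing works out to roughly the following, sketched in Python with the Pillow and NumPy libraries. The filename is just a placeholder for one of the downloaded plates.)
[code]
import numpy as np
from PIL import Image

# Placeholder filename -- stands in for any one of the raw filter plates.
plate = np.asarray(Image.open("filter_plate.png").convert("L"), dtype=np.float64)

# The "hard limiting" described above: stretch the plate so its darkest
# pixel lands on 0 (000000) and its brightest on 255 (FFFFFF).
lo, hi = plate.min(), plate.max()
stretched = (plate - lo) / max(hi - lo, 1) * 255.0

Image.fromarray(stretched.astype(np.uint8)).save("plate_stretched.png")
[/code]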
You can test this by opening one of the black and white plates (Photoshop again, sorry). Select either 000000 or FFFFFF as the working color, then go to the Select menu and choose Color Range. Set fuzziness to zero and hit OK. At each extreme you will find at least a few matching pixels.
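If you don't have Photoshop handy, a rough Python equivalent of that check is to count the pixels sitting at the two ends of the histogram (again, the filename is just a placeholder):
[code]
from PIL import Image

# Placeholder filename for one of the black and white filter plates.
plate = Image.open("filter_plate.png").convert("L")
hist = plate.histogram()  # 256 bins for an 8-bit greyscale image

print("pixels at 000000 (0):  ", hist[0])
print("pixels at FFFFFF (255):", hist[255])
print("overall range:", plate.getextrema())  # should come back as (0, 255)
[/code]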
On Earth
You can test the counter to this theory with a photo taken on Earth. Choose any photo taken on Earth (a good one to try is that autumn road one that comes with Windows XP). Open it in Photoshop and set its blending options so only the blue channel is showing. It's very dark, and there are no 0000FF pixels at all; in fact there are only a few 0000AA pixels, and they are in the whitish parts. You can try this with any picture taken on Earth, though try to avoid pictures with solid black and white in them, or something silly like a rainbow. White requires bright amounts of all of R, G and B to show, and the rainbow is self-explanatory.
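Here is the same check in Python form for the Earth photo (the path is just a placeholder, use any terrestrial photo you like):
[code]
from PIL import Image

# Placeholder path -- any photo taken on Earth will do.
photo = Image.open("autumn_road.jpg").convert("RGB")
blue = photo.getchannel("B")

print("blue channel range:", blue.getextrema())
# On a typical Earth scene (no pure white, no rainbow) the maximum sits
# well under 255, i.e. there are no 0000FF pixels at all.
[/code]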
Reason
Now, the reason for this is not immediately apparent. Why send the images in this form? The answer is simple.
More data.
By sending each plate with its values spread across the full range, you gain the maximum amount of data from each plate. Once you know the calibration information it is easy to scale each channel back down to its correct level and get the images looking as they should. If you were to send the images at equal levels, the blue channel would be sending a lot of information but not containing a lot of working data; it would be a fairly dark picture. So why not have the exposure a little higher on the L6 filter images and pick up all you possibly can from the surroundings?
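If we ever did get hold of the calibration info, undoing the stretch would just be a per-channel multiply, something like the sketch below. The scale factors here are completely made up, purely to show the idea.
[code]
from PIL import Image

# Made-up calibration factors -- the real exposure scaling per plate is
# exactly the information we don't have.
scale = {"R": 1.00, "G": 0.78, "B": 0.45}

rgb = Image.open("combined_equal_mix.png").convert("RGB")
r, g, b = rgb.split()

r = r.point(lambda p: min(255, int(p * scale["R"])))
g = g.point(lambda p: min(255, int(p * scale["G"])))
b = b.point(lambda p: min(255, int(p * scale["B"])))

Image.merge("RGB", (r, g, b)).save("rescaled.png")
[/code]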
It took me a long time to realise the answer had been staring me in the face, apologies for the delay. Unfortunately, this means it looks like it will be very hard for us to recombine the images into their actual true-color appearances. At first I thought it would be as simple as checking what changes were required to make the sundial show up correctly, then applying them to the other pics. In retrospect (!) this was a dumb idea from the beginning, as obviously an image that shows the primary colors (and white and black to boot) would require the full range of all channels, so its channels will be pretty much equal anyway.
Now, a way to test this is to get these images:
marsrovers.jpl.nasa.gov...
marsrovers.jpl.nasa.gov...
marsrovers.jpl.nasa.gov...
You will have to shrink the first one from 1024 to 512. 'EFF' is a prefix for 1024x1024 and 'EDN' is for 512x512; I don't know why, that's just the pattern I've noticed. These are the 3 plates that make up the top of the little silver pole and the corner of the sat-dish visible in the panorama.
marsrovers.jpl.nasa.gov...
Now, this pole has white (or something very close to it) and also areas very close to black, so in theory all 3 channels should be close to natural.
Combining them in Photoshop (in the manner mentioned before), we get a result that is extremely close to the colors in the panorama.
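For anyone following along without Photoshop, the 'equal mix' combine I keep referring to is just dropping the three greyscale plates straight into the R, G and B channels. A rough Python sketch, with placeholder filenames standing in for the three plates linked above:
[code]
from PIL import Image

# Placeholder filenames for the L4 (red), L5 (green) and L6 (blue) plates.
# The EFF frame needs shrinking from 1024 to 512 to match the other two.
red   = Image.open("L4_EFF_plate.png").convert("L").resize((512, 512))
green = Image.open("L5_EDN_plate.png").convert("L")
blue  = Image.open("L6_EDN_plate.png").convert("L")

# The same equal mix as the Photoshop channel merge described earlier.
combined = Image.merge("RGB", (red, green, blue))
combined.save("pole_and_dish_combined.png")
[/code]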
Yet when we use another 3-plate series from Sol 05 (which is largely made up of the panorama), such as these ones:
marsrovers.jpl.nasa.gov...
marsrovers.jpl.nasa.gov...
marsrovers.jpl.nasa.gov...
We get a completely different look, even though they are combined in the exact same manner. This is the effect of having all channels hard limited.
You can re-create this effect by choosing Auto Color in Photoshop. While this is often handy for brightening up images and so forth, it does not work well when you are dealing with images that are predominantly one color, and whose brightest and darkest points are not shades of grey.
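You can approximate what Auto Color is doing outside Photoshop as well: stretch each channel to the full 0-255 range independently. A quick sketch (filename is a placeholder):
[code]
from PIL import Image, ImageOps

# Placeholder filename -- try it on any photo to see the effect.
photo = Image.open("any_photo.jpg").convert("RGB")

# Stretch each channel to the full 0-255 range on its own, which is the
# same thing that has happened to the rover plates.
channels = [ImageOps.autocontrast(c) for c in photo.split()]
Image.merge("RGB", channels).save("auto_stretched.png")
[/code]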
How does NASA do it?
Well, clearly high-end, purpose-made image processing software is a big part of it. It may also be possible that information about the series is contained in the filename, or in other data transmitted by Spirit. I will continue trying to figure out if any part of the filename correlates to the exposure for that plate. It's a rather imposing filename though.
Are we boned?
Not at all. Any picture with white and black, or bright red, green and blue in it, will look almost exactly right when mixed evenly. What is the one thing we know has these? The sundial.
So any photo of the sundial (such as the ones shown earlier in the thread) we can be fairly sure will be accurate when mixed evenly. The convenient thing with the sundial is the fact that it has mirrors on it to show the Martian sky, so with any plate series of L4, L5 and L6 filtered plates we can see the true color of the Martian sky.
For example, we can see the sky color in the little mirrors at the edges of the sundial.
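If you want an actual number for the sky colour rather than just eyeballing it, you can average a small patch of the mirror in the combined image. The crop box and filename below are placeholders; the real coordinates would need to be read off the image itself.
[code]
from PIL import Image

# Placeholder crop box around one of the sundial's corner mirrors.
box = (612, 488, 640, 516)

sundial = Image.open("sundial_combined.png").convert("RGB")
mirror = sundial.crop(box)

# Average the patch to get a single RGB reading of the reflected sky.
pixels = list(mirror.getdata())
n = len(pixels)
avg = tuple(sum(p[i] for p in pixels) // n for i in range(3))
print("reflected sky colour (R, G, B):", avg)
[/code]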
Now, the one flaw with all this is the fact that a slight and constant hue of any sort would be removed by the equalisation of all the channels. So if anything, all these pics would likely have a slight red/orange tint.
I now have an insane amount of random images clogging up my computer, all in various stages of being fiddled with. Two interesting (and relevant) comparison ones are these:
There is a series on sol 8 (some of which swin pointed out) which looks like a test of almost all the filters at one hill. This was good news as it
allows us to compare the difference when we use an L2 filter as the red channel and when we use an L4 filter.
The results are below.
The slide on the left looks less red than the one on the right. Obviously the channels are normalised so the colours are not true, but it is a good visual example of the idea that using the near-infrared filter for the red channel will actually give the appearance of less red than using the L4 filter for the red plate would.
More updates tomorrow night; I need to find out if it's possible to get the exposure setting for each plate.
[Edited on 13-1-2004 by Kano]