Thanks for writing. The answer is that the color chips on the sundial have different colors in the near-infrared range of the Pancam filters. For example, the blue chip is dark near 600 nm, where humans see red light, but is especially bright at 750 nm, which is used as "red" for many Pancam images. So it appears pink in RGB composites. We chose the pigments for the chips this way on purpose, so they would provide different patterns of brightness regardless of which filters we used. The colors of the pigments are described in a paper I wrote in the December issue of the Journal of Geophysical Research (Planets), in case you want more details...
All of us tired folks on the team are really happy that so many people around the world are following the mission and sending their support and encouragement...
Thanks,
Jim Bell
Cornell U.
From howstuffworks.com
The human eye has three types of cones (red-, green-, and blue-sensitive). The peak absorbance of the blue-sensitive pigment is 445 nanometers, of the green-sensitive pigment 535 nanometers, and of the red-sensitive pigment 570 nanometers.
Table 2.1.2-1: Pancam Multispectral Filter Set: Wavelength (and Bandpass) in nm

LEFT CAMERA               RIGHT CAMERA
L1  EMPTY                 R1  430 (SP)*
L2  750 (20)              R2  750 (20)
L3  670 (20)              R3  800 (20)
L4  600 (20)              R4  860 (25)
L5  530 (20)              R5  900 (25)
L6  480 (25)              R6  930 (30)
L7  430 (SP)*             R7  980 (LP)*
L8  440 Solar ND          R8  880 Solar ND

*SP indicates short-pass filter; LP indicates long-pass filter
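To see why the filter choice matters for color, here is a minimal sketch of stacking three single-filter frames into an RGB composite, once with L4/L5/L6 (600/530/480 nm, roughly where the eye's cones respond) and once with L2 (750 nm) as "red". The file names, and the use of numpy and PIL, are assumptions for illustration only; real processing also involves radiometric calibration, not just stacking.

```python
# Minimal sketch: stack three registered single-filter Pancam frames into an
# RGB composite. File names are hypothetical; real work needs calibration.
import numpy as np
from PIL import Image

def load_band(path):
    # Load one grayscale filter frame and scale it to the 0..1 range.
    band = np.asarray(Image.open(path), dtype=np.float64)
    return band / band.max()

def composite(red_path, green_path, blue_path, out_path):
    rgb = np.dstack([load_band(p) for p in (red_path, green_path, blue_path)])
    Image.fromarray((rgb * 255).astype(np.uint8)).save(out_path)

# Approximate true color: L4 (600 nm), L5 (530 nm), L6 (480 nm).
composite("L4_600nm.png", "L5_530nm.png", "L6_480nm.png", "approx_true_color.png")

# Common Pancam composite with L2 (750 nm) as "red": the blue cal-target chip
# is very bright at 750 nm, so it comes out pink in this kind of image.
composite("L2_750nm.png", "L5_530nm.png", "L6_480nm.png", "false_color_750.png")
```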
(4) rudimentary automatic exposure control capability to maximize the SNR of downlinked data while preventing data saturation
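The idea behind that kind of exposure control can be sketched roughly like this; the 12-bit full scale, the percentile, and the headroom target are assumptions for illustration, not the flight algorithm.

```python
# Toy sketch of auto-exposure: scale the exposure time so a high percentile
# of the image sits just under saturation. All constants here are assumed.
import numpy as np

FULL_SCALE = 4095              # assumed 12-bit ADC
TARGET = 0.9 * FULL_SCALE      # leave headroom so bright pixels don't clip

def next_exposure(image, current_exposure_s):
    bright = np.percentile(image, 99.9)   # near-maximum signal in the frame
    if bright >= FULL_SCALE:              # already clipping: back off hard
        return current_exposure_s * 0.5
    # Signal scales roughly linearly with exposure time, so rescale toward the target.
    return current_exposure_s * TARGET / max(bright, 1.0)
```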
There are details and a figure published in that JGR paper I mentioned (Figure 20), and I've attached it here for reference. On the left are spectra of the cal target materials measured in a lab at NASA/JSC. On the right are the same materials measured by Pancam in a lab at JPL before launch. We're working on compiling a version of this from Pancam measurements on Mars, but basically (thankfully!) it looks very much like the panels on the right. Look at how whopping bright that blue chip is at 750 nm, for example!
The little silver pole is the low-gain antenna. It is used for low data rate communication, a few hundred bits per second, directly with Earth using X-band frequencies (around 8 GHz). It works over a wide range of angles, and so doesn't have to be pointed like the high-gain dish antenna. We use the low-gain antenna to send commands to the rover at low rates, around 30 bits per second, when the rover is awake but not using the high-gain to send data to us at the time. The low-gain is also a backup in case the high-gain pointing isn't working for some reason. We can work through the low-gain to fix it.
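To get a feel for those rates, here is a back-of-the-envelope calculation; the 1024 x 1024 x 12-bit raw frame size and the 300 bps number (standing in for "a few hundred bits per second") are my assumptions, and compression and packet overhead are ignored.

```python
# Rough time to move one raw camera frame over the low-gain link.
# Frame size and data rate are assumptions for illustration only.
frame_bits = 1024 * 1024 * 12      # ~12.6 million bits per raw frame
low_gain_bps = 300                 # "a few hundred bits per second"

hours = frame_bits / low_gain_bps / 3600
print(f"~{hours:.0f} hours per frame over the low-gain link")   # roughly 12 hours
```

That is why the low-gain is used for commands and as a backup, not for bulk image data.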
Now the secret decoder ring for the image file names. Taking one example file name from your article:
2P126644567ESF0200P2095L2M1.JPG
The breakdown is:
"2" for Spirit. "1" is Opportunity. (Don't ask.)
"P" is Pancam. Other choices are N - navcam, F - front hazcam, R - rear hazcam, M - microscopic imager, and E - EDL camera.
The next nine digits are the time the image was taken in seconds since noon UTC on January 1st, 2000.
The "ESF" is the product identifier, meaning a raw sub-framed image. There are many three-letter identifiers. Some common ones: EFF - raw full frame, FFL - full frame linearized, SFL - sub-frame linearized, EDN - downsampled raw image, DNL - linearized down-sampled, ETH - raw thumbnail, THN - thumbnail linearized (doesn't quite follow the convention). Linearized means that geometric optical distortions have been corrected. There are others for various levels of processing of the images.
"0200" is site 2 and position 0. We increment those counters when driving. Position is automatically incremented for each piece of a drive. We decide when we want to declare a new "site" to help distinguish the images.
"P2095" is the identifier of the command sequence that produced the image. This makes it easy, for example, for the person who wrote the sequence to find the images that were taken by their sequence.
"L" is the left eye. It can also be R - right, B - both, M - microscopic, or N - not an image.
"2" is the filter position, in the range 0..8 where 0 is no filter or not applicable.
"M" is the product creator, in this case the MIPL automatic image processing that is part of the MER downlink system. Other choices are A - Arizona State University, C - Cornell, F - USGS at Flagstaff, J - Johannes Gutenburg University, N - NASA Ames, P - Max Planck Institute, S - science operations team at JPL, U - University of Arizona, V - visualization team at JPL, or X - other.
"1" is the version identifier.
One additional reason to add to your "Why don't they tell us this" section is that we are simply so incredibly tired and overworked trying to keep up with this great stream of data that we don't have time to stop and spell it all out.
Yes, it does send back a bunch of information about the image, like the exposure time, camera pointing, temperature, etc. However, as far as I know, that data is not available online. Once the data is archived in a few months, all of that information will be included and documented. All of the mission data will be available at the cost of duplication.
Ultimately you are right that people will need the calibrated data to do the color balancing correctly. We are working on doing that and will eventually get all those images out to the public using the NASA/JPL "Planetary Image Atlas" web site. It will take several months or more to get the work done, however. In the meantime, we thought it would be best to get *something* out there, and so that's why we opted to get the raw data out fast, even though it's still raw. The team has taken some criticism for this within the planetary science community because not many past missions have adopted such an open-data policy.
Let them whine, I say. People want and deserve to see the pictures as soon as we do.