Originally posted by mikromarius
Thanks for the explanation on the dust. Finally someone who actually answers questions instead of whining about how stupid I am. I want these questions answered, that's all. And I see now that I turned 17% into 0.17%, sorry about that. But 1/6th isn't much. They still seem to fall down much quicker than they would have.
The camera registers LIGHT, not colors. The way that the camera can produce "color" images is to put a filter in front of it that filters out all frequencies EXCEPT the one that you are interested in.
And I am fully aware of that. My question is simply: How do they manage to get a fluorescent, or rather phosphorescent, effect in the blue hues without using a second light source emitting UV or IR light? You don't get this effect simply by putting a color filter in front of the camera, as far as I know.
And unless they tweak or "compress" the colors back into the visible spectrum, the effects won't even be visible. You would have to scale up the invisible light in order to see it.
Why on Earth NASA is doing this in their press pictures is quite odd, in my opinion. And their argument that every bloody blue thing on Mars is painted with some kind of superpaint in order to "calibrate" the pictures is just not good enough.
Originally posted by BarryKearns
Actually, they don't have to scale it up at all. All they do is take a signal from 750 nm (which the blackbody curve shows is still quite bright compared to visible frequencies), and when they combine it, the computer in effect "pretends" that the 750 nm signal was actually a 600 nm signal.
It's not that they've necessarily amped it up... they've just taken the curve and shifted it sideways.
This is, of course, just plain wrong to do... and especially so when they have the tools right there to see the REAL curve instead of pretending that one signal is another.
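The "sideways shift" described above is purely a relabeling of channels, not an intensity boost. A minimal sketch of the idea in Python (all band names and pixel values here are hypothetical, invented for illustration; this is not NASA's actual pipeline):

```python
import numpy as np

# Hypothetical single-band exposures from three filters (values made up).
# The near-IR 750 nm band is simply assigned to the red display channel,
# as if it had been captured at ~600 nm. No scaling is applied.
ir_750nm    = np.array([[0.80, 0.75], [0.90, 0.85]])  # near-IR filter
green_530nm = np.array([[0.30, 0.25], [0.35, 0.30]])
blue_480nm  = np.array([[0.20, 0.15], [0.25, 0.20]])

# Compose an RGB image by pretending the IR band is the red channel.
rgb_shifted = np.dstack([ir_750nm, green_530nm, blue_480nm])

print(rgb_shifted.shape)  # (2, 2, 3)
```

Since IR-bright materials would typically be dimmer at a true ~600 nm, a composite built this way skews red/orange relative to what the eye would see.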
Why on Earth NASA is doing this in their press pictures is quite odd, in my opinion. And their argument that every bloody blue thing on Mars is painted with some kind of superpaint in order to "calibrate" the pictures is just not good enough.
I'm having a lot of trouble accepting that explanation as well. It makes no sense to me that they would DELIBERATELY try to make so many things high-response in the IR when the natural curves are so different.
I could almost buy it on the sundial, since that is a calibration tool, and it is useful to be able to calibrate the high-IR right-lens filters. But why use so many high-IR-response pigments elsewhere?
If anything, doing so would almost certainly result in LESS data being received when a portion of the Rover is in the scene, due to the normalization of the channels. The Rover might very well present the brightest IR signal, and thereby prevent capturing a lot of the more subtle IR data from surrounding materials.
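The normalization effect described above can be sketched numerically. Assuming a simple per-channel scale-to-maximum normalization (the values and the helper below are invented for illustration, not taken from any actual Rover processing):

```python
import numpy as np

# Hypothetical IR readings: terrain varies subtly (1.00-1.06), while a
# bright Rover part in the frame reads far higher (10.0).
terrain = np.array([1.00, 1.02, 1.04, 1.06])
scene_with_rover = np.append(terrain, 10.0)

def normalize_8bit(signal):
    """Scale a channel to 0-255 by its own maximum value."""
    return np.round(signal / signal.max() * 255).astype(np.uint8)

# Alone, the terrain spans most of the 8-bit range, so its variation survives.
print(normalize_8bit(terrain))           # [241 245 250 255]
# With the Rover in frame, the terrain collapses into two adjacent codes,
# and the subtle IR differences are effectively lost.
print(normalize_8bit(scene_with_rover))  # [ 26  26  27  27 255]
```

This is the sense in which a high-IR-response Rover in the scene could cost data: whatever dominates the channel sets the scale, and everything subtler gets quantized away.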