
Evidence that NASA is altering the true colours of the pictures of Mars


posted on Dec, 15 2008 @ 12:33 AM

Originally posted by Yoda411
reply to post by RFBurns

Seriously though, can anybody think of any reason why someone would intentionally lie about these images being modified so drastically? I mean, seriously, RFBurns, it's called the red planet for a reason. In your pictures it's blue.


Why is our planet called blue when seen from space, when only the oceans are blue and the rest is brown, green, desert yellow, deep tropical green, and white over the polar caps?

Have you seen any of the Hubble photographs of Mars?

Even those show blue in the sky, just as you would see blue in our sky viewing Earth from a distance, along the outer edge of each.

Hubble image HERE.


Really, I'm not trying to be mean about your graphics credentials or anything like that. I get very defensive when someone accuses me of something without knowing the process of how NASA works the data before making the accusation.

But it's all right there, my friend. Even NASA themselves have published some images that clearly show Mars is more than just red. Just search their image databases for them. And work with the raw datasets yourself and you will clearly see that the results you get are far from saturated red.



Cheers!!!!



posted on Dec, 15 2008 @ 12:36 AM

Originally posted by RFBurns
Really, I'm not trying to be mean about your graphics credentials or anything like that. I get very defensive when someone accuses me of something without knowing the process of how NASA works the data before making the accusation.

But it's all right there, my friend. Even NASA themselves have published some images that clearly show Mars is more than just red. Just search their image databases for them. And work with the raw datasets yourself and you will clearly see that the results you get are far from saturated red.


Well, I am trying to be mean about your graphics. If your process were true, Mars would have a multi-colored surface that changes the farther out you look. Come on now, man, I researched the filtering process, applied it myself, and came up with a correct-looking image. Yours looks so wildly incorrect that I don't know where to start.

Edit: ALL of your graphics look horribly incorrect.

... and let me just say that this seriously discredits your forensics.


Originally posted by RFBurns
Ya, it is pretty easy to reclaim the color in their images if there is something white in the image. Just adjust the RGB channels appropriately for white balance and you pretty much get the right color. It might not be exact down to the millionth decimal point, as some would claim it must be to be valid, but if it's close, and shows a proper white on something that is white, then that's good enough.



[edit on 12/15/08 by Yoda411]



posted on Dec, 15 2008 @ 12:45 AM
reply to post by Yoda411
 


You posted an incorrectly processed image because, first of all, you did not use the L2 layer as your background. If you look up NASA's own steps for processing the data, you would not be getting that red image of yours, which doesn't even look like NASA's red images at all! That right there PROVES you're not doing it right to begin with and makes your point worthless.

So take the L2 image in its raw form and open it first. Apply the red color shading. Then you add the other available L-filter images as they are numbered: L2 is first, which is deep red; then L3 or L4 if they are available (red for L3, red-yellow for L4); then apply L5, which is green; then L6, which is blue; then L7, which is violet.

You work them just as they are numbered, starting from 0 through 8 as they sit on the camera filter cap, and that is exactly how NASA does the process. It's their own steps! Not mine or DA's or anyone else's. NASA's steps.

You don't count from 1 to 5 to 9 to 20 when counting numbers in a row. You start from zero, then 1, then 2, then 3, then 4, and so on.

Since NASA publishes only the filter data they put up on their websites, it's not my fault that L3 or L4 or L6 are not there. I don't control NASA's decision not to publish those filter datasets, and I don't control NASA or JPL, who command the rovers to take pictures using the filters that are on the camera.

We work with what NASA gives us. And in the example I posted at your request, you cannot deny that the result follows NASA's own image layering and combining process. It follows it to a T, exactly as NASA describes. It's not my fault that I, and others, have noticed how so many of their images come out red when their own cameras and data clearly say otherwise.

Just keep researching it and do the image processing as NASA does it. I and others are doing it exactly as they are.
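For anyone who wants to try that layering themselves, here is a minimal sketch in Python, assuming Pillow and NumPy; the filenames are placeholders for whatever raw frames you downloaded, and the channel stacking (L2 as red, L5 as green, L7 standing in for blue when L6 is absent) follows the ordering described above, not any official NASA tool:

```python
# Minimal sketch: stack three Pancam L-filter frames into an RGB composite.
# Filenames are placeholders, not real archive names.
import numpy as np
from PIL import Image

def load_gray(path):
    """Load a raw filter frame as a float grayscale array."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32)

red   = load_gray("sol100_L2.png")  # L2: deep red base layer, opened first
green = load_gray("sol100_L5.png")  # L5: green
blue  = load_gray("sol100_L7.png")  # L7: violet, standing in for blue

# Layer the frames in filter-number order as the color channels.
rgb = np.dstack([red, green, blue])

# Normalize to the 0-255 range and save the composite.
rgb = np.clip(rgb / rgb.max() * 255.0, 0.0, 255.0).astype(np.uint8)
Image.fromarray(rgb).save("sol100_rgb.png")
```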



Cheers!!!!



posted on Dec, 15 2008 @ 12:47 AM
Who told you that red is set as the background filter? This is taken directly from Cornell's website (an Ivy League school, for the uneducated).




posted on Dec, 15 2008 @ 12:48 AM
reply to post by Yoda411
 


Again, that shows that you don't know basic Photography 101. It's called a "white balance". And if you knew ANYTHING about photography, and about the 3 basic colors used to produce a color image, you would know you have to have all 3 of those basic colors, RED, GREEN, and BLUE, balanced in order to get...WHITE.

Please go look this all up on Google or a wiki or something. It's so basic that I can't stop laughing at all your uninformed ranting.
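To make the white-balance point concrete, here is a rough sketch in Python of scaling the RGB channels against a patch that should be white; the filename and patch coordinates are made up for illustration:

```python
# Rough white-balance sketch: scale each RGB channel so that a patch known
# to be white averages out neutral. Filename and patch coordinates are
# hypothetical; in practice you would pick a white object in the scene.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("composite.png").convert("RGB"), dtype=np.float32)

# Average color of the patch that should be white (rows 100-120, cols 200-220).
patch_mean = img[100:120, 200:220].reshape(-1, 3).mean(axis=0)

# Per-channel gains that pull the patch to a neutral white.
gains = patch_mean.mean() / patch_mean

balanced = np.clip(img * gains, 0.0, 255.0).astype(np.uint8)
Image.fromarray(balanced).save("balanced.png")
```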



Cheers!!!!



posted on Dec, 15 2008 @ 12:51 AM

Originally posted by Yoda411
Who told you that red is set as the background filter? This is taken directly from Cornell's website (an Ivy League school, for the uneducated).



Which one did they start with? It's right there, right in front of you!

When they combined the L4 and L5, that gave them the colored L4 in the middle of that picture. Then they took the L5 and L6 and combined them for the L6 final color...blue. Then they simply moved the L5, green, in between the L4 and L6 and laid those out in the order they were layered.

OK, if you think I'm wrong, then YOU take those raw data images and throw up L6 as your base layer, then add the others and see if that DVD comes out looking like theirs. In fact, put in either L6 or L5 as your base layer, then apply the others out of sequence, and see if that DVD comes out looking right.

I guarantee you it won't.




Cheers!!!!

[edit on 15-12-2008 by RFBurns]



posted on Dec, 15 2008 @ 12:54 AM

Originally posted by RFBurns
reply to post by Yoda411
 


Again, that shows that you don't know basic Photography 101. It's called a "white balance". And if you knew ANYTHING about photography, and about the 3 basic colors used to produce a color image, you would know you have to have all 3 of those basic colors, RED, GREEN, and BLUE, balanced in order to get...WHITE.

Please go look this all up on Google or a wiki or something. It's so basic that I can't stop laughing at all your uninformed ranting.

Cheers!!!!



Did you seriously overlook the Cornell-supplied information just to rant?



posted on Dec, 15 2008 @ 12:56 AM
reply to post by Yoda411
 


Apparently you did. You're not even paying attention to how they applied the layers to begin with. Again, just do it yourself: set any layer other than L4 as your base layer, then apply the others, and let's see your results.

Time for YOU to show us what you know. Ok? Fair enough? I think it is.

Oh, and pick one that doesn't have L4; pick one with L2 that jumps from there to L5 and jumps again to L7. If you don't start with L2 as the base layer, then L5, and then L7, you're gonna get very sick results!




Cheers!!!!

[edit on 15-12-2008 by RFBurns]



posted on Dec, 15 2008 @ 01:08 AM
reply to post by RFBurns
 


What would NASA have to gain from altering the color of their photographs? Absolutely nothing.



posted on Dec, 15 2008 @ 01:16 AM
reply to post by Yoda411
 




What would NASA have to gain from altering the color of their photographs? Absolutely nothing.

Suspicion is raised when NASA shows a picture of Mars with a blue sky and no oversaturated red hue at a press conference, while the majority of pictures they release have that oversaturated red hue. That's the point of this thread.

To be fair, the color pictures made by the folks at Cornell University do show a red sky. They supposedly created software that uses all 6 filters to create what they call true-color images.

I want to duplicate their results using the same method they use. I want to understand the method. It could make or break the case.

[edit] I actually do understand the method. I am halfway to completing my software.

marswatch.astro.cornell.edu...
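For illustration, here is a toy sketch of the general idea such software would implement: each filter band contributes to R, G, and B with some weight. The weight matrix below is invented; the real coefficients would come from the filter bandpasses and calibration-target measurements.

```python
# Toy sketch of a six-filter "true color" combine. The weight matrix is
# invented for illustration only; real coefficients would be derived from
# the Pancam filter bandpasses and the calibration target.
import numpy as np
from PIL import Image

FILTERS = ["L2", "L3", "L4", "L5", "L6", "L7"]
frames = [np.asarray(Image.open(f"sol100_{name}.png").convert("L"),
                     dtype=np.float32) for name in FILTERS]
stack = np.stack(frames, axis=-1)          # shape: (H, W, 6)

# Rows are R, G, B; columns are per-filter contributions (made-up numbers).
weights = np.array([
    [0.55, 0.30, 0.15, 0.00, 0.00, 0.00],  # red: mostly L2/L3/L4
    [0.00, 0.05, 0.25, 0.60, 0.10, 0.00],  # green: centered on L5
    [0.00, 0.00, 0.00, 0.10, 0.55, 0.35],  # blue: mostly L6/L7
])

rgb = stack @ weights.T                    # shape: (H, W, 3)
rgb = np.clip(rgb / rgb.max() * 255.0, 0.0, 255.0).astype(np.uint8)
Image.fromarray(rgb).save("sol100_truecolor_sketch.png")
```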

[edit on 15-12-2008 by Deaf Alien]



posted on Dec, 15 2008 @ 01:21 AM
reply to post by Deaf Alien
 


I don't mean to sound juvenile, but what prevented them from attaching a high-resolution digital camera alongside their MER camera, for accuracy and to calibrate the MER camera? Without a comparison, how would we ever know that the MER images are developing accurately?



posted on Dec, 15 2008 @ 01:27 AM
reply to post by Yoda411
 




I don't mean to sound juvenile, but what prevented them from attaching a high-resolution digital camera alongside their MER camera, for accuracy and to calibrate the MER camera? Without a comparison, how would we ever know that the MER images are developing accurately?

Lol, that question has been raised millions of times here on ATS. Many have even sent letters to NASA asking the very same question. Their response? "Not scientific."

Hundreds of millions of dollars into the program out of taxpayers' (that's us) pockets and they can't attach one stinking digital camera?



posted on Dec, 15 2008 @ 01:42 AM

Originally posted by Deaf Alien
reply to post by Yoda411
 




I don't mean to sound juvenile, but what prevented them from attaching a high-resolution digital camera alongside their MER camera, for accuracy and to calibrate the MER camera? Without a comparison, how would we ever know that the MER images are developing accurately?

Lol, that question has been raised millions of times here on ATS. Many have even sent letters to NASA asking the very same question. Their response? "Not scientific."

Hundreds of millions of dollars into the program out of taxpayers' (that's us) pockets and they can't attach one stinking digital camera?


"Observing" is arguably the most scientific part of an investigation.

That honestly makes me sick to my stomach. You can buy digital cameras now for $100 and less. They must have a lot of faith in this MER system if they send only that camera across that vast distance. Do geologists, or any other field of science, even use this type of imagery on Earth? That would make a good control. There are, what, 7 different filters necessary to compile the true-color images? Do they even provide all 7 to the public?

From observing the NASA images, it honestly looks like they themselves have not perfected this process by any means. That could easily be an explanation for the "blue sky" image. There is not enough nitrogen, much less oxygen, on the planet to produce that in reality.



posted on Dec, 15 2008 @ 01:42 AM
reply to post by Deaf Alien
 


Exactly!! Consider this for a moment. NASA's average yearly budget is over 11 billion dollars (article HERE), and the missions themselves cost 15 to 20 million dollars each. Given the sophistication of the cameras on these rovers, one would think that since we taxpayers are footing the bill, we could get a regular, decent RGB digital camera, or at least such an image out of the existing hardware, which is very capable of getting a full visual-spectrum RGB image with the appropriate filters.

But we don't get that. As DA pointed out, NASA's answer is "not scientific".

Well, OK, but is the majority of the taxpaying public only interested in "scientific" and "geological" results from their 15-to-20-million-dollar investment?

No, I don't think that's the case. For the price we pay for these missions, we should get more than just the science; not that there is anything wrong with the science, or with having a bunch of it. What the average person barely gets is red-saturated images, false-color images, and a lot of hot air for our investment.

Our purpose here, Yoda, is simply to provide an alternative view, using NASA's own data and their own processes to produce the images as they would come out from their published data. And in the examples that I have posted, and those of others, and even some from NASA's own websites, we find hard evidence that a majority of the color images that come from these raw datasets are clearly red-saturated. And for what purpose? Why make them so red-saturated?

Is it to cover up something about Mars they don't want us to know about? Is it to maintain some long-established notion that Mars is dead and nothing but rocks and hills? If that is the case, that Mars is nothing more than a dead, dry, red planet with rocks and hills, why spend so much of our tax dollars on rocks and hills, and equip rovers with the capability of getting color images with RGB filters, only to publish red images from those datasets?

Never A Straight Answer!!

No Actual Sense Always!!

We're not trying to re-write scientific protocol or dispute NASA's raw data. We are using NASA's published data from the start, and using NASA's processes and steps to make these images. They are as they are, straight from the raw data. In some of the ones I have posted, I merely turned up the color gain to bring out the color result more, or the brightness to make ground level more visible against the very bright skyline. That's all. There is no deliberate altering going on here whatsoever.

The raw data is there for anyone to produce these color images. Follow the steps laid out by NASA, use the raw datasets they publish, and see that Mars does indeed have more to it than just rocks, hills, and red.





Cheers!!!!



posted on Dec, 15 2008 @ 01:57 AM
I should also point out that they could have used RGB filters covering more range per filter, to expand the visual capability of the on-board camera. They used very narrow RGB filters. Why?

And in most cases, in their datasets, they did not always use the L4, L5 and L6 RGB filters! On most they would start with L2, then jump to L5, then jump again to L7! What happened to L3, L4, and L6 in those datasets?!

Why didn't they use those filters? Why did they put such narrow-bandwidth RGB filters on the filter cap? Wider-bandwidth RGB filters would not have taken up any more space on the cap, would not have raised the price of the hardware so much as to put them out of the mission budget, and certainly would not have interfered with the science aspect of the missions at all! They don't have to go through some rigorous routine to take a picture, be it for RGB or for geology. All they do is tell the camera to rotate the filter cap to each individual filter, the camera takes the picture, and the cap rotates to the next filter. Not a complex or time-consuming issue at all, considering they spent, what, 3 or 4 years collecting pictures!

Even in some of the geological images, they didn't use all of the geological filters! Why not? For example, for the rock that looks like wood, they didn't use ANY geological filters at all!! They did not even use the camera that has the filters!! They used the "navcam" to take pictures of it.

But I am willing to bet that they DID use the Pancam with the filters, including the narrow-bandwidth RGB filters, and DID take a close look at that rock/wood, and they are simply withholding that data from us!!

I mean, for one thing, a rover sent to Mars on a mission of geological importance doesn't even get the geological data on such an anomaly at all??

I find that very, very hard to accept.

It's just another one of those "No Anomaly Seen Ahaha" things out of NASA.

There is a lot that can be found just by looking at their published datasets that doesn't make a lot of sense at all, considering the capability of the hardware that's up there.

It's another "No Answer So Ahaha".




Cheers!!!!



posted on Dec, 15 2008 @ 02:01 AM
The L1-L7 filter concept actually seems like some sort of ancient color-photography concept: one filter for each color, combined to create the color picture.

I understand they justify it as a way to identify minerals (correct me if I'm wrong). I would appreciate it if anyone could elaborate on how that is even possible using this technology.



posted on Dec, 15 2008 @ 02:20 AM
reply to post by Yoda411
 


There are actually 14 different filters (if you include the clear and solar filters), not 7. Only 4 of the filters are really in what we call the visible spectrum. Use of infrared filters aids in the identification of minerals; visible light is much less useful.

As to why there is no full-color HD camera on the rovers? Two words: weight and power. It was a huge challenge to keep the rovers within the weight limitation imposed on them and still come up with a useful machine. It would have been nice to have really pretty pictures, but adding a camera to do it would have added weight at the expense of the truly scientific research. The rovers operate on very tight power budgets. If they don't get enough sunlight for a couple of days, they don't move or send data. It takes power to transmit the data from the cameras. Full-color HD takes a lot of data and would require a lot more power.
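To put rough numbers on the data argument, here is a back-of-envelope comparison; every figure in it is an assumption for illustration (a ~1-megapixel 12-bit filtered frame, an uncompressed 1080p color frame, a 128 kbit/s downlink), not an actual mission spec:

```python
# Back-of-envelope data-volume comparison. All numbers are illustrative
# assumptions, not mission specifications.
filtered_bits = 1024 * 1024 * 12   # one ~1 MP, 12-bit filtered frame
hd_bits       = 1920 * 1080 * 24   # one uncompressed 24-bit 1080p frame
downlink_bps  = 128_000            # assumed downlink rate

print(f"filtered frame: ~{filtered_bits / downlink_bps:.0f} s to downlink")
print(f"HD color frame: ~{hd_bits / downlink_bps:.0f} s to downlink")
print(f"the HD frame carries ~{hd_bits / filtered_bits:.1f}x more data")
```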



posted on Dec, 15 2008 @ 02:22 AM

Originally posted by Yoda411
The L1-L7 filter concept actually seems like some sort of ancient color-photography concept: one filter for each color, combined to create the color picture.

I understand they justify it as a way to identify minerals (correct me if I'm wrong). I would appreciate it if anyone could elaborate on how that is even possible using this technology.


The L1/L2 and L7 filters, along with the R1 through R7 filters, provide a very good means to identify chemical elements and their composition within the geology of the planet by looking at them in these spectral ranges. The normal visible range doesn't give us as much information as the IR and UV bands can provide.

The system is like an old rotoscope. The cameras themselves, on both the left and right side of the Pancam assembly, are very wide-bandwidth-capable cameras. The filter caps, placed directly in front of each camera, carry the filters. The cap rotates to each filter and the camera then takes the picture. Each photo is stored in the rover computer, and when the day's image acquisition is done, the rover parks, points its antenna at either Odyssey (or MGS, when it was working), and sends the data back to Earth, where they separate all the acquired data and work with it.
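As a sketch of that sequence in pseudocode-style Python; every name below is invented to show the flow, not the actual flight-software interface:

```python
# Sketch of the filter-wheel capture-and-downlink flow described above.
# All function and object names are invented for illustration.
LEFT_FILTERS = ["L1", "L2", "L3", "L4", "L5", "L6", "L7", "L8"]

def acquire_sequence(camera, storage):
    """Rotate the filter wheel through each position, exposing one frame each."""
    for position in LEFT_FILTERS:
        camera.rotate_filter_wheel(position)  # hypothetical call
        frame = camera.expose()               # hypothetical call
        storage.save(position, frame)         # buffer until the relay pass

def end_of_sol_downlink(rover, relay="Odyssey"):
    """Park, point the antenna at the relay orbiter, and send the day's frames."""
    rover.park()                              # hypothetical call
    rover.point_antenna(relay)                # hypothetical call
    rover.transmit(rover.storage.flush())     # hypothetical call
```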

A good analogy is a typical digital camera with nighttime vision capability, or a pair of IR goggles with an IR light source. With the eye alone, seeing in the dark is very difficult; even when your eyes adjust to the darkness, the information you're getting from them is very limited. But put on a pair of IR goggles, use an IR light source, and the entire world opens up! Far more information can now be seen in that darkness.

I am surprised that the rovers did not have nighttime IR illumination to be able to see things at night with the IR filters. With the way the temperature varies between night and day, it would be interesting to see any geological changes from daytime to nighttime.

I think the MSL rover, scheduled to launch, or rather was scheduled to launch, in 2009 and is now held back till 2011, has nighttime IR illumination and added IR filtering on the cameras, as well as better RGB visual filters too.

Let's hope that mission makes it to Mars!




Cheers!!!!



posted on Dec, 15 2008 @ 02:26 AM

Originally posted by RFBurns

The L1/L2 and L7 filters, along with the R1 through R7 filters, provide a very good means to identify chemical elements and their composition within the geology of the planet by looking at them in these spectral ranges. The normal visible range doesn't give us as much information as the IR and UV bands can provide.


If the filters do such an awesome job identifying chemical elements, then what was the need for the scoop and the portable mineral test lab attached to the craft?

Could we not have used that weight and solar power for the HD imagery, as Phage mentioned?



posted on Dec, 15 2008 @ 02:27 AM
reply to post by Phage
 


Though it's true that high-definition images would take a good amount of power and memory capacity in the computers, high-definition capability is not needed to get real color from the existing hardware at all. Only the appropriate filters on the filter cap are needed, i.e. wide-bandwidth RGB filters instead of extremely narrow ones.

We don't need to see Mars in hi-def to get real color. We never needed or had hi-def cameras to see real color 20 years ago, in pictures taken with a disposable Polaroid or even with an analog 8mm, VHS, or Beta camera, did we?

No of course not.

Having wider-bandwidth RGB filters on that filter cap on Spirit or Opportunity would not have required any more power to take or send images than the narrow-bandwidth filters did. It might take a little more time to send them and a little more memory, but not so much as to render the entire operation dead on its wheels because the power requirement would be tremendous.




Cheers!!!!


