
Mountains of Data for Papoose Lake Researchers (or any facility researcher, really)

posted on Aug, 20 2010 @ 12:26 AM
*Insert standard sorry-if-this-has-already-been-posted disclaimer here*

There is a vast resource of multispectral imagery data available online that includes imagery as recent as last week. NASA's Earth Observing-1 (EO-1) spacecraft launched on 21 November 2000 and has been providing high-resolution multispectral imagery of the Earth's surface since then. Of particular interest are certain swaths that the spacecraft images on a regular basis that include Papoose Lake and portions of Area 51 at a decent resolution.

There are two important multispectral imaging instruments on the spacecraft: Hyperion and Ali.

Hyperion is a high-resolution hyperspectral imager. It regularly images Papoose Lake and portions of Area 51. The imagery data available online that it captures each time is a 242-band, 16-bit-per-sample dataset at a spatial resolution of 30 meters. Each of these "snapshots" covers an area of 4.7 miles (7.5 km) by 62 miles (100 km).

So what's so great about this data? It's not the resolution. At only 30 meters per pixel, you won't be reading any license plates in the parking lot, but it's good enough that you can distinguish all of the major buildings and features at Area 51.

What's great about this data is that it isn't just your standard 8-bit-per-channel Red-Green-Blue lossy-compressed JPEG. It is a monstrous set of raw, uncompressed HDR data spanning a nice slice of the electromagnetic spectrum. Each time Hyperion images Papoose Lake, it records 16-bit data for 242 different wavelength bands from 355.59 nm to 2577.08 nm. By comparison, visible light runs only from about 390 nm to about 750 nm, so the imagery dips a little into the ultraviolet and dives well into the infrared, covering Near Infrared (NIR) and most of Short-Wavelength Infrared (SWIR).

The resulting dataset is robust enough to support a detailed spectral analysis of the surface. In other words, you can identify not only the color of the surface, structures, etc. in the imagery, but also something about what they are made of: their chemical composition. You could positively distinguish different types of pavement, different metals, different fabrics, different minerals in the soil, and so on. For example, Google Earth's current imagery (from 16 November 2006) shows some foam or something all over the taxiway to Hangar 18, and there has been a good bit of debate on the Internet as to whether it was AFFF or something else. Well, if you had Hyperion imagery from that day, you could answer that question definitively.

Before I get carried away writing about how awesome the data is, let's get down to something you might care about. Let's say, hypothetically, that there are hangar doors in the mountainside next to Papoose Lake disguised with a coating that resembles the rest of the mountainside. Even if that artificial coating looks like the natural mountain dirt, its chemical composition, as captured by Hyperion, is still going to differ from the surrounding mountainside. So if it were as simple as that, there should be a couple of pixels in that imagery that don't match the rest of the mountain.

Now, I'm not too worried about that, because even if there were hangar doors hidden there, they would just use the actual material of the mountain rather than an artificial lookalike coating, or use some other means to hide it. I'm interested in all the other invisible stuff. There are a looooot of things that are the same color as other things, but they all have their own spectrum. This imagery would show you areas where chemicals have been dumped, areas where artificial camouflaged structures may be hidden, artificial earthworks, soil brought in from other areas (with different chemical compositions), and even some areas that are generating their own heat (even subsurface) because blackbody radiation is significant above about 900 nm.
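To put a rough number on that last point about heat, here's a back-of-the-envelope Python sketch using Planck's law. The constants are the standard physical ones, but the 600 K "warm surface" example temperature is my own illustration, not anything taken from the imagery.

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody, W / (m^2 * sr * m)."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = math.expm1(h * c / (wavelength_m * k * temp_k))
    return a / b

# A surface at 600 K radiates orders of magnitude more at 2500 nm
# (near Hyperion's long end) than ambient terrain at 300 K does.
hot = planck(2500e-9, 600.0)
ambient = planck(2500e-9, 300.0)
print(hot / ambient)
```

The ratio is enormous at SWIR wavelengths, which is why a heat source can stand out in the long bands even when it looks like nothing in visible light.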

So how much data is really in one of these "snapshots"? Well, a single pixel of a normal uncompressed 24-bit bitmap image has 1.7x10^7 different possible values. A single pixel of a Hyperion "snapshot" has 3.9x10^1165 different possible values (numbers that high don't have names, by the way). In normal RGB images, there are 8 bits of data per channel for each pixel, so there are 256 different possible values for how much red is there, 256 possible green values, and 256 possible blue values. Instead of 3 channels, the Hyperion data has 242, and instead of 256 possible values per channel, there are 65536. If I haven't yet communicated the precision of this data effectively, I never will.
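The arithmetic above is easy to check in a couple of lines of Python:

```python
import math

# Possible values per pixel = (2 ** bits_per_band) ** number_of_bands.
rgb_pixel = (2 ** 8) ** 3          # ordinary 24-bit RGB pixel
hyperion_pixel = (2 ** 16) ** 242  # 242 bands at 16 bits each

print(rgb_pixel)                   # 16777216, i.e. ~1.7e7
print(math.log10(hyperion_pixel))  # ~1165.59, i.e. ~3.9e1165
```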


Anyway, another instrument on the EO-1 spacecraft is Ali, the Advanced Land Imager. It also regularly records high-resolution multispectral imagery of Papoose Lake and some of Area 51. The Ali data includes 9 separate 16-bit wavelength bands (in the same general range as Hyperion) at 30-meter-per-pixel resolution, in addition to a single 16-bit monochromatic band at 10-meter-per-pixel resolution. The latter is great because it provides regular, up-to-date HDR (albeit monochromatic) imagery of Area 51 and Papoose Lake. In fact, I just looked at one of these images and saw, for the first time in a satellite photo, the new location of lakebed runway 03-21. I thought I had discovered it during my flyby on 14 June 2010 (my photo below; the second photo is from heavily edited Ali data), but then learned it had been noticed before (supposedly in satellite photography, though I've never seen the photos).

Above: My aerial photo of the new location of lakebed runway 03-21 from 14 June 2010
Below: Heavily edited (from a screenshot, no less) Ali image from 25 February 2010 showing (faintly, below the other lakebed runway) the new location of lakebed runway 03-21.


So there are mountains of data available to peruse and analyze. BUT, there are a few challenges....
1. Files are very large. One Hyperion "snapshot" of a swath that includes Papoose Lake is going to be a 250+ MB download. Uncompressed, the dataset is going to be as big as 1.5 GB.
2. Good luck with file formats. The fact is, if you're not a "computer person" then you aren't going to be able to do anything with this data--in fact, you probably won't even be able to extract it (continued below)

[edit on 20-8-2010 by shmuu]



posted on Aug, 20 2010 @ 12:26 AM
(continued from above)...in fact, you might not even be able to download the data to begin with. So, this is really only an option for the kind of people who run Linux both because it's cool and because everyone else in their D&D group uses it, and who like to be able to share with each other the assembly language programs they write that rely on Linux-specific interrupts, if you catch my drift. The data will come in a .tgz file containing a .tar file. Depending on which dataset you downloaded, the .tar file will have one of the following:
A. Level 1R dataset: A .hdr file, a .L1R file, a .MET file, and a .fgdc file -- Good luck with this one. The .L1R file is the actual imagery. It will probably be around 650 MB. Allegedly, it's an HDF4 file. Like I said, good luck with that; hope you have Linux.
B. Level 1GST dataset: A .L1T file, a .fgdc file, and 242 .tif files -- The .tif files are the actual imagery. They are 16-bit monochromatic GeoTIFFs, about 9 MB each, totaling 1.2 GB. Photoshop should be able to handle 16-bit tiffs, but sadly, gimp will not, though it can open them by converting them to 8-bit tiffs (kinda pointless after that). CinePaint should be able to handle them, but I couldn't really tell since the only workable Windows build I found had issues. If you're using Linux, you probably already have 27 different nearly identical applications designed for editing HDR tiffs anyway.
C. A .fgdc file and 12 .tif files -- *B01_L1T.TIF is the 10-meter-per-pixel 16-bit monochromatic GeoTIFF. *MTL_L1T.TIF is a metadata file. The 9 other .tif files are the 30-meter-per-pixel multispectral imagery.
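Since the distribution format is half the battle, here's a minimal Python sketch of unpacking one of these downloads with the standard-library tarfile module. The archive built here is a tiny synthetic stand-in (the filename is made up) so the example runs on its own; point the second half at a real .tgz instead.

```python
import os
import tarfile

# Build a tiny stand-in archive: real EO-1 downloads arrive as a .tgz
# (a gzipped tar) of the imagery files; this band filename is made up.
with open("B042_L1T.TIF", "wb") as f:
    f.write(b"\x00" * 16)
with tarfile.open("scene.tgz", "w:gz") as t:
    t.add("B042_L1T.TIF")
os.remove("B042_L1T.TIF")

# tarfile reads the gzipped tar directly; no external tools needed.
with tarfile.open("scene.tgz", "r:gz") as t:
    t.extractall(".")
print(os.path.exists("B042_L1T.TIF"))  # True
```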

Anyway, I'll let you figure the rest out on your own. You can find the wavelengths for the different images online or in one of the .hdr files (ASCII).
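For anyone going the Python route instead of Photoshop/CinePaint, a few lines of Pillow and NumPy (my choice of libraries, not anything the archive requires) will read one band's raw 16-bit values without the 8-bit truncation gimp does. The band written here is synthetic so the sketch runs standalone; a real one would be one of the *_L1T.TIF files.

```python
import numpy as np
from PIL import Image

# Write a small synthetic 16-bit band so this runs without a real scene.
rng = np.random.default_rng(0)
synthetic = rng.integers(0, 4096, (64, 64)).astype(np.uint16)
Image.fromarray(synthetic).save("band.tif", format="TIFF")

# Read it back as raw 16-bit digital numbers (no 8-bit truncation).
band = np.array(Image.open("band.tif"))
print(band.dtype, band.shape)  # uint16 (64, 64)
```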
There are multiple ways to access the imagery data. The way I have been doing it is to go to edcsns17.cr.usgs.gov... , select the EO-1 options on the left, and zoom the map to the area I'm interested in. Then click once in the map to set one corner of the area you want to find imagery for, then click again to set the other corner. You may need to register for an account, I'm not sure, but at least you don't have to be approved or go click a link in a confirmation e-mail or anything.

Here are some direct links to some "snapshots" I've downloaded (GeoTIFF only, I believe) that I bothered to copy the URL for. They may or may not work without registering an account; I'm not sure. These all show Papoose Lake, and many show Area 51 too:
EO-1 Hyperion:
2004/01/30
2007/02/23
2007/10/05
2008/08/04
2008/08/09
2009/01/12
2009/06/15

EO-1 Ali:
2010/02/25
2010/04/02
2010/07/09
2010/08/09

There are many, many more, and new imagery pops up regularly. If you go there, you'll undoubtedly notice that there are a lot of other datasets to search. I've found some good high-resolution imagery on some of them too, both old and new. It was kinda cool to see Area 51 when there was nothing there but more dry lake.




posted on Aug, 20 2010 @ 12:42 PM
Surprising you haven't received any responses yet. This is great info, in spite of the large file sizes. Kudos to you OP!



posted on Aug, 20 2010 @ 05:47 PM
reply to post by this_is_who_we_are
 

Well, perhaps all of the other people who are interested are still working on reading that excessively long set of posts.




Anyway, I actually did some work with this multispectral imagery. I took the Ali (Advanced Land Imager) imagery from 2 April 2010 (no particular reason for picking that date over more recent imagery, other than it being a clear day and my already having downloaded the data) and pulled some of the longer wavelength bands together into a false color image in hopes of distinguishing surface features with different chemical compositions. And despite the fact that I don't now, nor will I probably ever, believe anything this man says, the results look like they might give S-4 proponents a reason to be hopeful.
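For anyone wanting to try the same thing, combining three single-band arrays into a false-color RGB is one line of NumPy. The arrays below are synthetic stand-ins for real band files, and mapping the longest wavelength to red is just one arbitrary choice among many.

```python
import numpy as np

# Three stand-in 30 m bands; with real data these would be the three
# longest-wavelength Ali band arrays loaded from their .TIF files.
rng = np.random.default_rng(4)
b07, b08, b09 = (rng.random((32, 32)) for _ in range(3))

# Map longest wavelength -> red, next -> green, next -> blue.
false_color = np.dstack([b09, b08, b07])
print(false_color.shape)  # (32, 32, 3)
```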


First, here's a small section of the image I created from the data. Keep in mind that this is one of countless possible false color images I could have created; it's the first thing I tried, but there are some things that can be concluded from it.

Click here or below for a larger image.




There are a couple of things notable about this image. The mountainside immediately east of the vegetation peninsula on the edge of the lakebed clearly has a different chemical composition than the surrounding terrain. This is convenient, but it's not very unusual; such is the nature of mountains, which vary in composition across their surfaces. What I found more interesting is the little "turnaround" where the vegetation peninsula meets the edge of the lakebed. In normal imagery, it appears to be just a break in the vegetation, appearing the same as the rest of the lakebed, as seen below.

In the multispectral imagery from EO-1 Ali, however, the "turnaround" area shows a completely different chemical composition than the rest of the lakebed.

Anyway, I don't have a lot of time right this second, so I'll stop here. I will say that after looking over the big image I made (the photo I included here was just a small section of it), I saw many changes in the Groom/NTS imagery between Google Earth's current imagery and this Ali imagery, including new construction at the Project Pluto site.



posted on Aug, 21 2010 @ 02:51 AM
I'm not sure you need a geotiff reader. I installed two on one of my linux boxes and the images are basically black. (Qlandkt and geotiff) I don't have a 16 bit viewer on that box (cinepaint never installed), but I read one of your gunzip files and tweaked it a bit in gimp (truncated to 8 bit), then chopped out yucca lake.
yucca lake

Cinepaint, for all the bragging, seems to have been beta for years. The closest thing to an install is to run the cvs and do the cmake. That still has bugs, but the tar files have sed scripting errors in their makefiles.

Given your description of the files, I think imagej would do the best post processing.

Incidentally, you are not seeing the chemical composition in the multispectral images. That would require ionizing the material. Look at it this way. Suppose you had an iron plate. It would look one way at ambient, and quite different if you heated it with a torch, but the composition hasn't changed.

Somewhere on ATS and also on alt.conspiracy.area51, I've posted photos of the Papoose Hills. Not the greatest. I did some quick shots from about 9000ft, nearly at the top of Bonanza Peak. I lost the trail near the top, but will go back this fall and try to reach the top again. It probably has a view of Groom Lake. A distant view. ;-)



posted on Aug, 21 2010 @ 01:33 PM
reply to post by gariac
 

What the multispectral data shows you is the amount of electromagnetic radiation coming from each pixel's worth of Earth's surface (or clouds/whatever) for each of the 242 wavelength bands the Hyperion instrument records, whether that radiation be from an emissive source (lights/thermal radiation) or from the "reflection" of light from the Sun or elsewhere. With the exception of thermal radiation, which really depends only on temperature, most of the photon emissions (and so most of the electromagnetic radiation) are caused by the Sun's light being "reflected" (absorbed by an electron, exciting it, then re-emitted when the electron drops to a lower energy level) towards the spacecraft by the surface. The "reflected" photon's wavelength is the same as that of the photon absorbed from the Sun, but not all of the Sun's light is re-emitted. The chemical composition of the surface determines which wavelengths are re-emitted and which are absorbed.

The spectrum of light "reflected" off of a surface is the "color" of the surface. Our eyes (like most cameras) ditch most of the incoming wavelength information and just add up all the light in a few broad wavelength bands that are useful to us. With 242 bands instead of 3 (RGB), you can get much more information about that surface. I probably overstated how much can be determined from the 242 bands by saying something like "detailed chemical analysis," but you can certainly distinguish surfaces with different chemical compositions that may appear identical to the naked eye, or to the naked tricolor CCD. After all, that is the intended purpose of the Hyperion and Ali instruments on EO-1. Plus, since they cover not only visible wavelengths but also NIR & SWIR (IR-A & B), you can possibly determine something about the temperature of the surface, though that is not an area I know much about.
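One standard way to act on that "same color, different spectrum" idea is the spectral angle mapper, which compares a pixel's 242-band spectrum against a reference spectrum while ignoring overall brightness. This sketch is my own illustration with made-up spectra, not anything from the EO-1 tooling.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle in radians between two spectra viewed as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

bands = 242
rng = np.random.default_rng(3)
reference = rng.random(bands)      # e.g. a lab spectrum of some mineral
pixel_same = reference * 0.5       # same material, dimmer illumination
pixel_other = rng.random(bands)    # a different material

# Scaling doesn't change the angle, so shadowed copies still score ~0,
# while a genuinely different spectrum scores clearly higher.
print(spectral_angle(reference, pixel_same))
print(spectral_angle(reference, pixel_other))
```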
I know all the other stuff from the research for a paper I wrote but have not yet published, and from writing a spectrum-based photon tracer in an attempt to create the most realistic (and incredibly slow) reference renderer ever. You should have seen the file size for a 640x480 render.


Oh, by the way, I uploaded the entire image I made yesterday (or whenever that was), linked below. The Papoose/Groom picture was a small portion of this big one.

It's 5493x11343 if you click on it for the full size image.



A note about the image. It uses the three longest wavelength bands of the Ali imagery and is scaled up 300% and multiplied over the Ali panchromatic imagery (which has 3 times the resolution), so the chroma resolution (if you want to call it that) is only a third of the full resolution.
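That multiply-over-pan trick can be sketched in a few lines of NumPy; the arrays here are random stand-ins for the real bands, and the shapes are illustrative only.

```python
import numpy as np

# Stand-ins: three 30 m false-color layers and one 10 m panchromatic band.
rng = np.random.default_rng(1)
swir = rng.random((3, 20, 20))   # three bands at 30 m/pixel
pan = rng.random((60, 60))       # panchromatic band at 10 m/pixel

# Scale each 30 m band up 300% by pixel repetition, then multiply over
# the pan band to borrow its finer spatial detail.
chroma = swir.repeat(3, axis=1).repeat(3, axis=2)
sharpened = chroma * pan         # broadcasts (3, 60, 60) * (60, 60)
print(sharpened.shape)  # (3, 60, 60)
```

As the post says, the result has full luma resolution but only one-third the chroma resolution, since the color layers were merely pixel-repeated.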

I'm going to start playing with some Hyperion imagery next to take advantage of its much narrower, much more numerous wavelength bands.

By the way, for Windows users, IrfanView (google it) 4.27 will load the .TIF files quickly, and although it does just convert them to 8-bit, it range-compresses (or similar) them first (how thoughtful) to a reasonable level so you don't end up with a virtually black image. IrfanView is free and might be available on other platforms too. I used it to convert the .TIFs that I used to make the above image.
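IrfanView's range compression can be replicated in a few lines as a percentile stretch from 16-bit to 8-bit; the 2/98 percentile choice and the synthetic band are my own, and real scenes use only a narrow slice of the 0-65535 range, which is why a naive bit-shift looks black.

```python
import numpy as np

# Stand-in 16-bit band occupying a narrow slice of the 0-65535 range.
rng = np.random.default_rng(2)
band = rng.integers(2000, 6000, (64, 64)).astype(np.uint16)

# Percentile stretch: map the 2nd-98th percentile range onto 0-255.
lo, hi = np.percentile(band, (2, 98))
stretched = np.clip((band - lo) / (hi - lo), 0.0, 1.0)
eight_bit = (stretched * 255).astype(np.uint8)
print(eight_bit.min(), eight_bit.max())  # 0 255
```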

I may write my own little Hyperion/Ali imagery viewer, and if I do, I'll post it online. It would be nice to have it automatically find/calculate certain things, and just having a clean interface from which to see the imagery and switch seamlessly between wavelength bands would help too. We'll see. I don't know anything about TIFF.



posted on Aug, 21 2010 @ 03:23 PM
Imagej also runs on Windows. For Windows 7, you need to be administrator; there are permission difficulties saving files if you are not. I will say Imagej runs better under Linux. I suggest not using their bundled Java, but installing your own.

Imagej has a forum frequented mostly by biologists. It has a large contingent of scientists. I think you can run your opinion of what multispectral analysis sees by them.

Imagej has probably a hundred plugins.



posted on Jun, 13 2011 @ 06:29 PM
reply to post by shmuu
 


Am I missing something here?
Why get excited about high-data-content imagery and do fly-bys over the Groom Lake area
if I can see both runways clearly on Google's imagery?



posted on Jun, 13 2011 @ 07:34 PM
reply to post by arrow2fast
 


The goal is to find disturbed areas that do not show up in visible light. The resolution is so bad I'm not convinced this is useful.

In any event, this kind of analysis really needs to be done on a baseline first. That is, you study some area where you have physical access, then see if the multispectral analysis finds something odd. Having spent plenty of time in the desert, I can say that sometimes the ground is a funny color or a bit different in plant life for no apparent reason, or more accurately, for no man-made reason.

I have spent time looking at GE for clues to plane crashes. Unless the resolution is really high, satellite observation doesn't reveal very much. It was only in the last year or so that crash sites could be seen on Google Earth. So finding something deliberately designed to be hidden seems unlikely by satellite.

Incidentally, I managed to get cinepaint installed via a repository. That is one ugly program to build.



