Metadata is our context. And that can reveal far more about us — both individually and as groups — than the words we speak. Context yields insights into who we are and the implicit, hidden relationships between us. A complete set of all the calling records for an entire country is therefore a record not just of how the phone is used, but, coupled with powerful software, of our importance to each other, our interests, values, and the various roles we play.
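To make that concrete, here is a minimal sketch of how calling records alone can expose a social graph. The records, field names, and names are all invented for illustration; real call-detail records carry far more (timestamps, cell towers, device IDs), which is exactly what makes them revealing.

from collections import Counter

# Toy call-detail records: (caller, callee, duration_seconds).
cdrs = [
    ("alice", "bob", 120), ("alice", "bob", 45), ("bob", "carol", 300),
    ("alice", "dave", 60), ("carol", "dave", 15), ("alice", "bob", 600),
]

# Edge weights of the social graph: no call *content* needed.
edges = Counter((a, b) for a, b, _ in cdrs)

# Degree = how many distinct people each person talks to.
degree = Counter()
for (a, b), _ in edges.items():
    degree[a] += 1
    degree[b] += 1

print("strongest tie:", edges.most_common(1))    # who matters to whom
print("most connected:", degree.most_common(1))  # hubs in the network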
Utah Data Center Technical Specifications
Data Storage Capacity
The storage capacity of the Utah Data Center will be measured in "zettabytes". What exactly is a zettabyte? There are a thousand gigabytes in a terabyte; a thousand terabytes in a petabyte; a thousand petabytes in an exabyte; and a thousand exabytes in a zettabyte. Some of our employees like to refer to them as "alottabytes". Learn more about the domestic surveillance data we plan to process and store in the Utah Data Center. Also, view our strategy for using the PRISM data collection program, nationwide intercept stations, and the "Boundless Informant" mapping tool to gather and track this data.
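The unit ladder above is just successive factors of a thousand; a quick sanity check (decimal SI units, not the binary 1024-based ones):

GIGABYTE  = 10**9             # bytes
TERABYTE  = GIGABYTE * 1000
PETABYTE  = TERABYTE * 1000
EXABYTE   = PETABYTE * 1000
ZETTABYTE = EXABYTE * 1000

print(ZETTABYTE)              # 10**21 bytes
print(ZETTABYTE // GIGABYTE)  # a trillion gigabytes per zettabyte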
Code-Breaking Supercomputer Platform
NSA Utah Data Center supercomputer
The Utah Data Center will be powered by the massively parallel Cray XC30 supercomputer, which is capable of scaling high-performance computing (HPC) workloads to more than 100 petaflops, or 100,000 trillion calculations per second.
Code-named "Cascade", this behemoth was developed in conjunction with the Defense Advanced Research Projects Agency (DARPA) to meet the demanding needs of the Intelligence Community.
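The "100,000 trillion calculations per second" figure is just the petaflop conversion spelled out:

PETAFLOP = 10**15            # floating-point operations per second
TRILLION = 10**12

cascade = 100 * PETAFLOP
print(cascade / TRILLION)    # 100000.0 -> 100,000 trillion calc/sec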
reply to post by SkepticOverlord
Call content, email content, tweets, text messages, web page content, etc.
With the credibility of human intelligence (HUMINT) often questionable regardless of nationality or ethnicity, this is perhaps the only way to process, clean, and transform data against the NSA's data models, and the only way to get a quick enough turnaround to identify possible scenarios ahead of time. How that trending analysis and its output are used is what matters, for good or for bad. People used to collect information manually and publish surveys and statistics once a year, whether for the census or for social security; with computers, the task has become fast and simple. Obviously, much of that same technology is exploited by the bad guys these days.
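For what "trending analysis" can mean here, a minimal sketch: count events per period and flag an unusual jump. The event stream, the term, and the doubling threshold are all invented for illustration.

from collections import Counter

# Hypothetical (term, week) observations standing in for any event stream.
events = [("protest", 1), ("protest", 1), ("protest", 2),
          ("protest", 3), ("protest", 3), ("protest", 3), ("protest", 3)]

weekly = Counter(week for _, week in events)

# Flag a week whose count is more than double the running average so far.
seen, total = 0, 0
for week in sorted(weekly):
    if seen and weekly[week] > 2 * (total / seen):
        print(f"week {week}: possible trend ({weekly[week]} events)")
    total += weekly[week]
    seen += 1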
Originally posted by SkepticOverlord
reply to post by Archie
Yes, that was focused on data collection. I wanted to focus on data analysis.
whyamIhere
Now we can add everything we buy at the store, including the pharmacy.
Everywhere we drive is on video, complete with license-plate capture.
Every time we use our ATM, there is a complete record of what we bought and the cash we received.
Not to mention every time we travel, complete with what is in our shoes.
My question, SO, is: how long until our cash-strapped government starts selling this information?
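Taken together, those streams describe a simple record-linkage problem: once each source carries a stable identifier, cross-referencing them is a dictionary lookup. A minimal sketch, with every record, key, and name invented for illustration:

# Hypothetical records from the sources listed above, each keyed on a
# stable identifier (a made-up card number and license plate).
purchases = {"card:1234": ["pharmacy 2013-06-28"]}
plates    = {"plate:UT777": ["I-15 NB 2013-06-28 08:14"]}
atm       = {"card:1234": ["withdrawal $200 2013-06-28"]}
identity  = {"J. Doe": ["card:1234", "plate:UT777"]}

# One lookup per stream reassembles a day in someone's life.
for person, keys in identity.items():
    profile = []
    for k in keys:
        profile += purchases.get(k, []) + plates.get(k, []) + atm.get(k, [])
    print(person, "->", profile)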
Originally posted by JBA2848
nsa.gov1.info...
They say they will pump more than 170 million gallons of water a day through there just for cooling. This is no simple metadata storage facility. And who knows how many supercomputers are going to be in this building? They say what one can do, but they don't say how many are planned. The Utah Data Center will be the cloud that all intelligence agencies get their data from. And the NSA will control who gets what.
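A toy sketch of that "who gets what" idea: per-record compartment labels gating a shared store, with access decided by each agency's clearances. The agencies, labels, and records here are all invented for illustration.

# Invented compartment labels and agency clearances.
clearances = {"FBI": {"SIGINT-DOM"}, "CIA": {"SIGINT-FOR"},
              "NSA": {"SIGINT-DOM", "SIGINT-FOR"}}

records = [("intercept-001", "SIGINT-DOM"), ("intercept-002", "SIGINT-FOR")]

def visible_to(agency):
    """Return only the records whose label the agency is cleared for."""
    allowed = clearances.get(agency, set())
    return [rid for rid, label in records if label in allowed]

print("FBI sees:", visible_to("FBI"))
print("NSA sees:", visible_to("NSA"))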