
Meet Palantir, the modern Minority Report

posted on Aug, 2 2017 @ 05:18 PM
For those who always thought that science fiction takes a lifetime to manifest itself in daily life, think again.

In the 2002 movie, Tom Cruise's character reflected upon his own decisions about using pre-cognitive personas to fuel the engine for their pre-crime unit. Not only was it a movie, but one with a happy ending.

Real-life though, has a way of taking everything that is good about science fiction (especially the happy ending) and twisting it into some unrecognizable form of malfeasance.

I present to you Palantir.



Palantir — a CIA-backed startup created, in part, by PayPal’s billionaire co-founder Peter Thiel in 2004 — is a little-known tool that’s changing the world right under our noses.

Once used to predict roadside bombs in Iraq based on patterns of previous deployment, Palantir is now being deployed here at home for everything from law enforcement to finance.

The tool currently resides in a nondescript building on a back street in Palo Alto, California. From the outside, you might not think much of it. Inside, the technology is protected by walls that are impenetrable by radio waves, phone signals, or internet: its only means of entry secured with advanced biometrics, and pass codes held by dozens of independent parties whose identities are protected by blockchain technology.
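
The "pass codes held by dozens of independent parties" arrangement described above is essentially secret sharing. A minimal sketch of the general idea, assuming simple XOR splitting (my assumption for illustration; certainly not Palantir's actual scheme):

```python
# Illustrative XOR secret sharing: split a passcode among n parties so
# that ALL shares are required to reconstruct it. Hypothetical example,
# not Palantir's real mechanism.
import secrets
from functools import reduce

def split_secret(secret: bytes, n: int) -> list:
    """Split `secret` into n shares; every share is needed to rebuild it."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:  # XOR the secret with each random share
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares) -> bytes:
    """XOR all shares back together to recover the secret."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), shares)

shares = split_secret(b"open-sesame", 12)  # e.g. a dozen independent parties
assert combine(shares) == b"open-sesame"
```

Any subset short of all twelve shares reveals nothing about the passcode, which is why spreading shares across independent parties hardens the entry process.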


Can anyone imagine the types of corruption that would ensue from implementing (woooops, it already has been implemented) such a pre-cognition algorithm?

We're worried about Voter Fraud when there's this? Make sure you don't go outside dressed the way this algorithm predicts that day's criminal will be dressed... you'd have a bad day.

Read all about it here
edit on Wed Aug 2 2017 by DontTreadOnMe because: trimmed overly long quote



posted on Aug, 2 2017 @ 05:48 PM
That building should be burned to the ground.

Anyone tied to what's in there should be inside it when it happens.



posted on Aug, 2 2017 @ 05:52 PM
This is only the beginning.



posted on Aug, 2 2017 @ 05:54 PM
a reply to: alphabetaone

'Palantir'? Does that mean we get to call Langley 'Mordor'?



posted on Aug, 2 2017 @ 05:56 PM

originally posted by: AugustusMasonicus
a reply to: alphabetaone

'Palantir'? Does that mean we get to call Langley 'Mordor'?



I imagine it probably does.



posted on Aug, 2 2017 @ 06:03 PM
As more technological doors open up loopholes, laws will be introduced that will allow predictive AI to hold water in our court system. Predictive AI will be connected to every aspect of human living. This is exactly what Transhumanists (H+) want. Eradication of everything human is their endgame.



posted on Aug, 2 2017 @ 06:07 PM
a reply to: alphabetaone

I would ask P@l@ntir "what am I thinking right NOW, smartass?"

Answer would be "I Do Not Know But I Can Postulate That It Might Be....." [robot voice]

To which I would respond... "Same thing with your 'future predictions'... nothing but a GUESS"



posted on Aug, 2 2017 @ 07:37 PM
a reply to: alphabetaone

#. The public sector needs its own AI to combat this. It's pretty alarming; it's like they're setting up an external entity which they use to justify their actions, like a new kind of "God". Only they have the means to influence this god and its outcomes. Absolutely horrific.



posted on Aug, 2 2017 @ 07:45 PM

originally posted by: FamCore
... nothing but a GUESS"


Likely true. But when that "guess" becomes the basis for law enforcement policy, it carries some serious implications along with it.



posted on Aug, 2 2017 @ 07:46 PM
a reply to: WorShip

I do agree. It is absolutely horrific.



posted on Aug, 2 2017 @ 08:23 PM

originally posted by: alphabetaone
Can anyone imagine the types of corruption that would ensue from implementing (woooops, it already has been implemented) such a pre-cognition algorithm?

We're worried about Voter Fraud when there's this? Make sure you don't go outside dressed the way this algorithm predicts that day's criminal will be dressed... you'd have a bad day.


In CS circles, Palantir has been well known for many years now. Putting aside the morality of their work, they have a reputation for hiring only the best, alongside absolutely grueling interviews, hellish workdays, and invasive background checks.

One of the big consequences of this is that their management basically adopted a churn-and-burn approach to workers, giving the company very high turnover. Staying at the company for just a couple of years would result in a lot of seniority because people just wouldn't last.

This all brings up an interesting point to think about. When we think of national security in the private sector, what comes to mind is career people who have proven for decades that they can keep secrets and take it seriously. At Palantir, people learned the secrets and left within 6 months. It's quite a security risk now, because at some point people are going to talk if they haven't yet.



posted on Aug, 2 2017 @ 09:41 PM
a reply to: Aazadan




In CS circles, Palantir has been well known for many years now.


What does the "CS" stand for? Also,



When we think of national security in the private sector, what comes to mind is career people who have proven for decades that they can keep secrets and take it seriously. At Palantir, people learned the secrets and left within 6 months. It's quite a security risk now, because at some point people are going to talk if they haven't yet.


You seem to know a lot about Palantir.



posted on Aug, 2 2017 @ 09:59 PM
a reply to: Aazadan

Not sure which is more frightening - OP or that post.

edit on 8/2/2017 by kosmicjack because: (no reason given)



posted on Aug, 2 2017 @ 10:00 PM
a reply to: ZeusAduro

Lawsuits. Many, many lawsuits are in their future when this tech starts being used to target and convict people.

And hopefully these lawsuits will set a precedent that the data and the algorithms used in these systems are not allowed to be classified as trade secrets. As they shouldn't be, because we should all have a right to know what data and information are being used, as well as how the algorithm is programmed to predict these outcomes.

These lawsuits will also be used in an attempt to limit this technology's predictive power, for example by eliminating the use of data that will cause the machine to have a bias.
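
As a toy illustration of that kind of restriction (a minimal sketch with hypothetical field names, not any real system's schema), "eliminating" biased inputs can be as blunt as stripping sensitive attributes before a model ever sees a record:

```python
# Illustrative sketch: drop attributes that could encode bias before
# handing records to a predictive model. All field names are hypothetical.

SENSITIVE_FIELDS = {"race", "religion", "neighborhood", "clothing_style"}

def strip_sensitive(record: dict) -> dict:
    """Return a copy of the record with sensitive attributes removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

record = {"age": 34, "prior_arrests": 0, "clothing_style": "hoodie"}
print(strip_sensitive(record))  # {'age': 34, 'prior_arrests': 0}
```

Real fairness work is much harder than this, since the remaining fields can act as proxies for the removed ones, but it shows the shape of what a court-ordered restriction might demand.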

It's going to be an interesting next few decades.



posted on Aug, 2 2017 @ 10:05 PM
a reply to: WorShip

The problem is, how does the machine classify people who don't have a normal, set schedule? As persons of interest who deserve to be monitored even though they haven't done anything wrong except not have a consistent schedule? Or do they prefer a certain type of clothing, so they get targeted by the police because they like wearing hoodies?

The future of technology like this is lawsuits. Lots and lots of lawsuits. In fact, with any luck, this technology will be litigated into extinction.



posted on Aug, 2 2017 @ 10:07 PM
a reply to: alphabetaone

I don't see the big deal; it doesn't predict anything. What it does is look for patterns in information, the same thing we humans have been doing since we first appeared on the planet. Our brains are wired to look for patterns even where they don't exist.
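
For what it's worth, "looking for patterns in information" at its most basic really is just counting recurring combinations. A toy sketch with made-up data (purely illustrative; it has nothing to do with Palantir's actual code):

```python
# Toy pattern-finding: count how often pairs of events co-occur across
# observations. Entirely hypothetical data, just the shape of the idea.
from collections import Counter
from itertools import combinations

observations = [
    {"rain", "traffic_jam"},
    {"rain", "traffic_jam", "accident"},
    {"clear", "light_traffic"},
    {"rain", "accident"},
]

pair_counts = Counter()
for obs in observations:
    for pair in combinations(sorted(obs), 2):
        pair_counts[pair] += 1

# The most frequent pairs are the "patterns" such a system would surface.
print(pair_counts.most_common(3))
```

Whether the surfaced pairs reflect real causation or mere coincidence is exactly the question about brains (and machines) seeing patterns where none exist.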



posted on Aug, 2 2017 @ 11:27 PM

originally posted by: dragonridr
a reply to: alphabetaone

I don't see the big deal; it doesn't predict anything. What it does is look for patterns in information, the same thing we humans have been doing since we first appeared on the planet. Our brains are wired to look for patterns even where they don't exist.


Well, the big deal doesn't really revolve around an ability to "predict" (and ostensibly, neither does the argument), but around the belief that an algorithm is capable of positively identifying an action, thus becoming a means by which law enforcement can target the citizenry. Do you really find it out of the realm of possibility that the justice system, sometime in the near future, would point to its "pre-cognition" machinery as a means to oppress any group it chose?

A human being is far more complex than any machine a scientist could build or even imagine. I mean, let's face it, if sentience could be manufactured, it already would have been... likewise with precognition. That, to me, is the big deal: not that it predicts, but that people will be led to believe that it does.



posted on Aug, 3 2017 @ 06:43 AM
a reply to: FamCore

CS in this case stands for Computer Science. Palantir hires a lot of CS professionals, focused mainly on the top recent grads. They're often used as a benchmark by those people: if you can successfully navigate their interview process and get a job offer, it means you really know your stuff, as they're very selective.

I know a little bit about the company, mainly on the tech side, along with what sort of work they do. From what I've heard lately, the company is actually struggling to bring in talent now. Their reputation for being tough has crossed the threshold from helping them select candidates to turning a lot of potential hires away.



posted on Aug, 3 2017 @ 12:05 PM

originally posted by: Aazadan

This all brings up an interesting point to think about. When we think of national security in the private sector, what comes to mind is career people who have proven for decades that they can keep secrets and take it seriously. At Palantir, people learned the secrets and left within 6 months. It's quite a security risk now, because at some point people are going to talk if they haven't yet.


The fact that there is no way we can find out which private citizens hold other private citizens' personal information is alarming in itself. With a high degree of turnover, we don't know where these people landed, and that should be of great concern.

I'm less worried about them talking and more so about what strategies they attempt to employ given their knowledge.



posted on Aug, 3 2017 @ 01:03 PM
a reply to: alphabetaone
This is big data, algorithms, and a bit of what may become a pretty decent (but limited) AI.

This type of processing has been around for a long time, and it is now finally maturing. I can remember writing routines in the early '90s that performed predictive analytics for manufacturing companies, guiding them in how much of which materials and chemicals to produce based upon past cycles.
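
As a rough illustration of what such a routine boils down to (a minimal sketch, not the actual code from back then, and the numbers are made up):

```python
# Minimal sketch of cycle-based demand forecasting: predict the next
# cycle's material usage from the mean of recent cycles. Illustrative
# only; the function name and data are hypothetical.

def forecast_next_cycle(past_usage, window=4):
    """Return the mean of the last `window` cycles as the next forecast."""
    recent = past_usage[-window:]
    return sum(recent) / len(recent)

# Example: quarterly chemical usage (tons) over the last two years.
usage = [12.0, 14.5, 13.2, 15.1, 12.8, 14.9, 13.5, 15.4]
print(forecast_next_cycle(usage))  # 14.15 -> suggested next-quarter output
```

Modern predictive analytics layers regression and machine learning on top of this, but the core move is the same: project the next cycle from patterns in past cycles.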

The private sector has used predictive analytics for decades, and the federal government has been trying to use them for at least 20 years. They have been hustled quite a few times also (check out the NSA's Trailblazer Project) and have not had that much real success.

Long story short, as long as your information is for sale, the use of data analytics will continue and increase, mostly because it is a good tool for businesses to use to make money. I am not putting an evil shade on the practice (although it can be used for evil), but rather showing it as a standard business practice. What it becomes down the road is anyone's guess.

For instance, if a diaper company can find out you are most likely pregnant, why not send you some sample diapers before you give birth? You may develop brand loyalty.

If the serial numbers of the bills that you withdraw from an ATM in a local convenience store are recorded (and they are), and the records are sold to the store, the store can track your spending habits there and know how much profit it makes by having that machine on its premises. This links up with the system at the bank where the store deposits its funds at the end of the day: the bank reads the bills' serial numbers, cross-references them against ATM withdrawals, and sells the lists to the merchant.
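
In data terms, that cross-referencing is just a join on serial numbers. A hedged sketch of the idea, with entirely hypothetical record structures:

```python
# Illustrative sketch of the cross-referencing described above.
# All record layouts and values here are hypothetical assumptions.

atm_withdrawals = [
    {"serial": "B7731", "account": "acct-001"},
    {"serial": "C0912", "account": "acct-002"},
]
store_deposits = [
    {"serial": "B7731", "store": "Quicky-Mart #5"},
    {"serial": "A4420", "store": "Quicky-Mart #5"},
]

# Index withdrawals by bill serial number, then join deposits against it.
by_serial = {w["serial"]: w["account"] for w in atm_withdrawals}
matches = [
    {"account": by_serial[d["serial"]], "store": d["store"]}
    for d in store_deposits
    if d["serial"] in by_serial
]
print(matches)  # [{'account': 'acct-001', 'store': 'Quicky-Mart #5'}]
```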

You may see this as a big loss of personal information, but the stores, businesses, and banks see it as a way to understand profit, and a way to charge fees for new services in big data.

So this is happening right now, but most of it is happening in the private sector. Big brother is not as smart (yet) as the little guy running a Quicky-Mart.

edit on 3-8-2017 by TacSite18 because: (no reason given)



