
AI POLICE STATE: China to use technology to predict crimes BEFORE they happen

posted on Jul, 24 2017 @ 04:39 PM


Oh boy, is this a step in the wrong direction, even for China. AI to determine when someone is going to do something "bad".


Much like in the 2002 film Minority Report, starring Tom Cruise, authorities in the East Asian country want to catch criminals before they have done any wrongdoing.

The police in the surveillance state have enlisted the help of AI to determine who is going to commit a crime before it has happened.

Li Meng, vice-minister of science, said: “If we use our smart systems and smart facilities well, we can know beforehand… who might be a terrorist, who might do something bad.”
www.express.co.uk...

If a citizen visits a "weapons shop", the firm can combine this data with other data to assess that individual's chance of committing a crime. Big data can flag highly suspicious groups of people based on where they go and what they do.

Cloud Walk spokesperson Fu Xiaolong said: “The police are using a big-data rating system to rate highly suspicious groups of people based on where they go and what they do.”

He added that the risk rises if the person “frequently visits transport hubs and goes to suspicious places like a knife store”.
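
Roughly speaking, a rating system like the one described could be nothing more than a weighted score over the kinds of places a person visits. Here is a minimal sketch of that idea; the place categories, weights, and threshold are purely illustrative assumptions on my part, not anything Cloud Walk has actually published.

# Hypothetical sketch of a big-data "risk rating" built from places a person visits.
# The categories, weights, and cutoff below are made-up assumptions for illustration;
# nothing here reflects Cloud Walk's actual system.

PLACE_WEIGHTS = {
    "knife_store": 3.0,
    "weapons_shop": 4.0,
    "transport_hub": 1.0,
    "grocery_store": 0.0,
}

SUSPICION_THRESHOLD = 8.0  # arbitrary cutoff for "highly suspicious"

def risk_score(visits):
    """Sum the weights of every place category in a person's visit history."""
    return sum(PLACE_WEIGHTS.get(place, 0.0) for place in visits)

def flag_suspicious(person_visits):
    """Return the people whose visit history pushes them over the threshold."""
    flagged = {}
    for person, visits in person_visits.items():
        score = risk_score(visits)
        if score >= SUSPICION_THRESHOLD:
            flagged[person] = score
    return flagged

logs = {
    "citizen_a": ["grocery_store", "transport_hub"],
    "citizen_b": ["knife_store", "transport_hub", "weapons_shop", "transport_hub"],
}
print(flag_suspicious(logs))  # only citizen_b crosses the arbitrary threshold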


AI is watching you...



posted on Jul, 24 2017 @ 04:53 PM
a reply to: seasonal

This is one of the scariest things I have seen in a long, long time. To all the mooks who say, "If you're not doing anything wrong, what do you care?" when Big Brother continues to invade our rights, this is exactly what is coming to a democracy near you. I expect continuous smaller "terrorist" attacks that will eventually be followed by AI law enforcement rounding up the "suspicious" individuals. On the bright side, maybe AI LEOs won't shoot so many of us.



posted on Jul, 24 2017 @ 04:56 PM
a reply to: TobyFlenderson

Very well could be the plan.

With this we can help: all we need to do is turn on this big data AI, and it will help to keep you safe.


Ahhh, false flags. All that practice has led up to this, in theory so far.



posted on Jul, 24 2017 @ 04:57 PM
a reply to: seasonal

Not surprised this is happening in China. But isn't finding terrorists a good thing anyway? At least that's how it appears to me.



posted on Jul, 24 2017 @ 05:01 PM
a reply to: Deaf Alien

Until the definition changes to you being the terrorist.



posted on Jul, 24 2017 @ 05:08 PM

originally posted by: seasonal
a reply to: Deaf Alien

Until the definition changes to you being the terrorist.


Which really is the crux of the issue, is it not?!

Unless we are in a different timeline where the governments of the past, the oligarchs/monarchs/elitists of all kinds, have been benevolent and not ever used their power to suppress dissent.

Wouldn't that be great!!



posted on Jul, 24 2017 @ 05:09 PM

originally posted by: seasonal
a reply to: Deaf Alien

Until the definition changes to you being the terrorist.

Maybe. But cops always investigate suspicious activities.



posted on Jul, 24 2017 @ 05:22 PM
a reply to: Deaf Alien

This is not that situation. It is the gathering of everything that can be gathered and then using algorithms to determine who is going to do something bad.



posted on Jul, 24 2017 @ 05:23 PM

originally posted by: seasonal
a reply to: Deaf Alien

This is not that situation. It is the gathering of everything that can be gathered and then using algorithms to determine who is going to do something bad.

I am sure the FBI, CIA and NSA do the same thing to protect us. Many terrorist acts have been foiled.



posted on Jul, 24 2017 @ 05:25 PM
Only it's NOT like Minority Report.

Do THEY have "precogs" ...


Psychic people in VATS that "see into the future"


No?

Then it's NOT like Minority Report.



posted on Jul, 24 2017 @ 05:28 PM
a reply to: Deaf Alien

From the story.


“If we use our smart systems and smart facilities well, we can know beforehand… who might be a terrorist, who might do something bad.”


The Patriot Act does give the govt HUGE "tools" to combat terrorism. In fact, prison without a court ruling and no lawyers.
Do you subscribe to the school of "if I am not doing anything wrong I have nothing to fear"?

I do not subscribe to that.



posted on Jul, 24 2017 @ 05:30 PM
a reply to: seasonal

I always subscribe to personal privacy. I know it's a difficult subject. Privacy vs. security.



posted on Jul, 24 2017 @ 05:39 PM
a reply to: seasonal

I'll bet it doesn't apply to corrupt politicians and military!



posted on Jul, 24 2017 @ 05:45 PM

originally posted by: DanteGaland
Only it's NOT like Minority Report.

Do THEY have "precogs" ...


Psychic people in VATS that "see into the future"


No?

Then it's NOT like Minority Report.


No, they use wise old Chinese men sitting around in a Chinese restaurant drinking tea and smoking opium.

They call them pre-Changs


forgive me for that



posted on Jul, 24 2017 @ 06:06 PM
a reply to: DanteGaland

Precogs are the computers and big data.



posted on Jul, 24 2017 @ 06:07 PM
a reply to: seasonal

Meh. We already have this tech, and it's been used.

It was in the news and everything.

www.zerohedge.com...




In this city’s urgent push to rein in gun and gang violence, the Police Department is keeping a list. Derived from a computer algorithm that assigns scores based on arrests, shootings, affiliations with gang members and other variables, the list aims to predict who is most likely to be shot soon or to shoot someone.

The police have been using the list, in part, to choose individuals for visits, known as “custom notifications.” Over the past three years, police officers, social workers and community leaders have gone to the homes of more than 1,300 people with high numbers on the list. Mr. Johnson, the police superintendent, says that officials this year are stepping up those visits, with at least 1,000 more people.

During these visits — with those on the list and with their families, girlfriends and mothers — the police bluntly warn that the person is on the department’s radar. Social workers who visit offer ways out of gangs, including drug treatment programs, housing and job training.

“We let you know that we know what’s going on,” said Christopher Mallette, the executive director of the Chicago Violence Reduction Strategy, a leader in the effort. “You know why we’re here. We don’t want you to get killed.”


But just having that big data is not enough.
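
For what it's worth, the kind of list the excerpt describes boils down to a weighted score over a person's record, with the top scorers picked out for "custom notification" visits. A minimal sketch under that assumption follows; the variables, weights, and list size are made up for illustration and are not the department's actual model.

# Hypothetical sketch of a Chicago-style "heat list": a weighted score over a
# person's record, then the highest-scoring people are chosen for visits.
# The variables and weights are illustrative guesses, not the real algorithm.

WEIGHTS = {
    "arrests": 1.0,
    "shootings_involved": 5.0,
    "gang_affiliations": 2.0,
}

def list_score(record):
    """Weighted sum of the record's variables."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)

def custom_notification_targets(records, n):
    """Pick the n highest-scoring people for a 'custom notification' visit."""
    ranked = sorted(records.items(), key=lambda kv: list_score(kv[1]), reverse=True)
    return [person for person, _ in ranked[:n]]

records = {
    "person_1": {"arrests": 2, "shootings_involved": 0, "gang_affiliations": 1},
    "person_2": {"arrests": 5, "shootings_involved": 2, "gang_affiliations": 3},
    "person_3": {"arrests": 0, "shootings_involved": 0, "gang_affiliations": 0},
}
print(custom_notification_targets(records, 2))  # ['person_2', 'person_1']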



posted on Jul, 24 2017 @ 06:11 PM
a reply to: grey580



Either Chicago is getting more violent, or their Minority Report procedures need work.



posted on Jul, 24 2017 @ 06:12 PM

originally posted by: seasonal
a reply to: grey580



Either Chicago is getting more violent, or their Minority Report procedures need work.

I don't think you need much of a prediction in Chicago.



posted on Jul, 24 2017 @ 10:52 PM
It might work in China, but it won't work here. In China, the people are different, so they might be able to tell if a person is going to get out of hand. If you want a Chinese kid to do something, you talk to them respectfully; to get a northern European kid to do something, we yell at them. This continues into adulthood; our cultures are totally different.

Everyone in our country would be in prison if we used the Chinese technology to predict crimes in advance.



posted on Jul, 25 2017 @ 07:09 AM
a reply to: seasonal

I've said it before and I'll say it again.

You can't fix a problem without fixing the underlying causes of the problem.

Using big data to predict crime isn't going to solve the problem.



