As long as we're heading into an age of predictive policing, it's good to know that some police departments are willing to turn the ThoughtCrime scanner on their own employees.
Police departments across the U.S. are using technology to try to identify problem officers before their misbehavior harms innocent people, embarrasses their employer, or invites a costly lawsuit — from citizens or the federal government.
Of course, some of this is just "insider threat" detection that ousts whistleblowers before they can blow the whistle and punishes employees for not adhering to the prevailing mindset. Nothing about this software is anywhere close to perfect, but it's still being used to (hopefully) head off police misconduct before it occurs. But what the systems flag doesn't seem to be stopping cops before they do something regrettable.
The systems track factors such as how often officers are involved in shootings, get complaints, use sick days and get into car accidents. When officers hit a specific threshold, they're supposed to be flagged and supervisors notified so appropriate training or counseling can be assigned.
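To make the mechanics concrete, here's a minimal sketch of that kind of threshold flagger in Python. The factor names come from the excerpt above, but the OfficerRecord structure, the thresholds, and the scoring are purely illustrative assumptions -- not the LAPD's actual criteria.

from dataclasses import dataclass

# A minimal sketch of the threshold-based flagging described above.
# Factor names follow the excerpt; the thresholds are hypothetical.

@dataclass
class OfficerRecord:
    name: str
    shootings: int       # officer-involved shootings
    complaints: int      # citizen complaints on file
    sick_days: int       # sick days taken this period
    car_accidents: int   # on-duty vehicle accidents

# Hypothetical per-factor thresholds; a real system would tune these.
THRESHOLDS = {
    "shootings": 1,
    "complaints": 3,
    "sick_days": 15,
    "car_accidents": 2,
}

def flag_officer(officer: OfficerRecord) -> list[str]:
    """Return the factors on which the officer meets or exceeds a threshold."""
    return [
        factor for factor, limit in THRESHOLDS.items()
        if getattr(officer, factor) >= limit
    ]

officer = OfficerRecord("Badge 1234", shootings=0, complaints=4,
                        sick_days=20, car_accidents=1)
alerts = flag_officer(officer)
if alerts:
    # In the systems described above, this is where a supervisor would
    # be notified so training or counseling can be assigned.
    print(f"ALERT for {officer.name}: {', '.join(alerts)}")

Note what the sketch makes obvious: the software's job ends at the print statement. Everything after the alert is a human decision, which is exactly where the reporting says these systems break down.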
The LAPD's inspector general found in a recent review that the system was seemingly ineffective in identifying officers who ultimately were fired. The report looked at 748 "alerts" over a four-month period and found the agency took little action in the majority of cases and only required training for 1.3 percent, or 10 alerts, of them.
The LAPD presents this as a software failure -- and some of it is. What's being flagged isn't necessarily indicative of potential misconduct. But beyond the algorithm, there's an integral part of the process being ignored:
Experts say the early warning system can be another powerful tool to help officers do their jobs and improve relations, but it is only as good as the people and departments using it… "These systems are designed to give you a forewarning of problems and then you have to do something."
Even the IG's report notices nothing's being done: 748 "alerts" resulted in action on only 10 of them.
The LAPD is trying to portray this as a software failure, most likely in hopes of ditching the system that was forced on it by its own bad behavior. (The irony here is that police departments will argue that predictive policing software doesn't work on cops but does work on citizens.)
On the other side of the coin lies the LA Sheriff's Department -- at least in terms of predictive software.
The sheriff's department has an early warning system. "Our diagnostic systems were fine," said the department's Chief of Detectives, Bill McSweeney, who advised his agency on creation of the warning system. "Our managerial and supervision response was not fine. It's that simple."
The LASD is finally acknowledging that it let its officers (and prison guards) act with impunity for far too many years. The system could have worked -- at least in its limited capabilities -- but no one wanted to follow up on flagged officers. The situation there has deteriorated to the point that the LASD is looking at a few years of federal supervision.
Predictive policing is still a bad idea, even for policing police. While data may help pinpoint problem areas, the flagging systems are far too inaccurate to guarantee hits. But the problem within law enforcement agencies is the lack of accountability, not faulty software. Unless the first problem is addressed, it won't matter how much the software improves in the future.
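The "far too inaccurate" point has a concrete statistical basis: flagging rare behavior runs into the base-rate problem. Here's a quick illustration using Bayes' theorem; every number in it is a made-up assumption for the sake of the example, not a measured property of any department's system.

# An illustrative base-rate calculation (all numbers are hypothetical).
# Even a flagging system with seemingly decent accuracy produces mostly
# false positives when genuine problem officers are a small minority.

prevalence = 0.05           # assume 5% of officers are true "problem" officers
sensitivity = 0.80          # assume the system flags 80% of true problems
false_positive_rate = 0.10  # assume it also flags 10% of everyone else

# P(problem | flagged) via Bayes' theorem
p_flagged = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
precision = (sensitivity * prevalence) / p_flagged

print(f"Share of flags that point at real problems: {precision:.0%}")
# -> roughly 30%; about 7 of every 10 alerts land on the wrong officer.

Under those assumed numbers, most alerts are noise -- which is exactly why the human follow-up matters more than the algorithm.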
While such "early warning systems" are often treated as a cure-all, experts say, little research exists on their effectiveness or — more importantly — if they're even being properly used.
originally posted by: Thecakeisalie
To paraphrase the Fourth Doctor (i.e., Doctor Who):
"Computers are just sophisticated idiots. They can solve a million equations in a second but yet they still need someone to tell them what to do."
originally posted by: damwel
Abuse of power should be a capital crime. If a cop violates the public trust, execute him/her. Police should be held to a higher standard than the public.
originally posted by: AllSourceIntel
What is the answer? I don't think it is in predictive programming, but rather in the public's continued, and ever-increasing, use of cameras trained on law enforcement; further, every state should make it law that dashcams and uniform cams record 24/7, without the ability to turn them off.