
When Drones Decide to Kill on Their Own

posted on Oct, 1 2012 @ 09:33 AM
The U.S. military has made progress developing drones that can operate with little, if any, human supervision. People in the U.S. military insist that when it comes to lethal strikes, this new generation of drones will remain under human command.

Nevertheless, UAVs will no longer be the "dumb" drones in use today; instead, they will have the ability to "reason" and will be far more autonomous, with humans acting more as supervisors than as the ones behind the controls. Talk about one hell of an ethical dilemma.

 




thediplomat.com...



posted on Oct, 1 2012 @ 09:38 AM
reply to post by travis911
 


I doubt the military will go forward with this. The risk that the drone decides to kill the wrong enemy is too big. However evolved they may seem, AIs are pretty dumb. I know, I have one.

If the drone destroys the wrong target, the military itself would face danger.



posted on Oct, 1 2012 @ 09:53 AM
reply to post by travis911
 


I'm with the Swan on this one. If I were in the military I would be afraid of the "friendly fire" possibilities. We have enough trouble with accidents without worrying about drones gone rogue. Let's hope they think long and hard before taking such an unnecessary risk!



posted on Oct, 1 2012 @ 10:05 AM
reply to post by travis911
 


And I suppose the drones could be programmed with facial recognition software, and images could be uploaded so they would kill only those targets. So let's say someone on the Senate Appropriations Committee decided to vote against more funding for this program: their image could be uploaded, he/she would be taken out, and it would be blamed on a program error. Of course, it could happen to ANYONE!



posted on Oct, 1 2012 @ 10:09 AM
reply to post by rangersdad
 


Facial recognition is pretty dumb. I have a camera with that feature, and it thinks a letter is a face. What would a drone do if that happened to it? Shoot at Las Vegas's shiny letters? Target the Statue of Liberty's face?



posted on Oct, 1 2012 @ 10:10 AM

Originally posted by rangersdad
reply to post by travis911
 


And I suppose the drones could be programmed with facial recognition software, and images could be uploaded so they would kill only those targets. So let's say someone on the Senate Appropriations Committee decided to vote against more funding for this program: their image could be uploaded, he/she would be taken out, and it would be blamed on a program error. Of course, it could happen to ANYONE!


Sure... but what about look-alikes, doppelgängers, etc.? I've seen a few people who look exactly like people I grew up with, over 2,000 miles away.



posted on Oct, 1 2012 @ 10:11 AM
reply to post by christoph
 


Another good point.



posted on Oct, 1 2012 @ 10:28 AM
reply to post by swan001
 


Just because your consumer model has a cheesy facial recognition program doesn't mean the government will have that same one... theirs will be a lot better than the ones available to the general public.



posted on Oct, 1 2012 @ 10:31 AM
reply to post by christoph
 


Even doppelgängers have some discrepancies... they won't be a perfect match. Even if you wore a Halloween mask that made you look like Nixon and went around making the peace sign, saying "I am not a crook," it should be able to tell the difference between a mask and a real face.
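As a rough illustration of why look-alikes are a real problem for automated targeting: face matchers generally reduce a face to a feature vector and accept a match when a similarity score clears a threshold. The vectors and the threshold below are invented for illustration, not taken from any real system; a doppelgänger whose vector is nearly identical slips past the cutoff.

```python
# Hypothetical sketch: compare face feature vectors by cosine similarity
# against a fixed acceptance threshold. All numbers are made up.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.95  # looser threshold -> more false matches on look-alikes

target    = [0.80, 0.55, 0.21, 0.63]  # enrolled target's face vector
lookalike = [0.79, 0.56, 0.22, 0.61]  # a doppelgänger: nearly identical
stranger  = [0.10, 0.90, 0.75, 0.05]  # unrelated face

print(cosine_similarity(target, lookalike) > THRESHOLD)  # True: look-alike passes
print(cosine_similarity(target, stranger) > THRESHOLD)   # False: stranger rejected
```

The point of the sketch is that "some discrepancies" only help if they push the score below the threshold; a close enough look-alike is indistinguishable to the matcher.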




posted on Oct, 1 2012 @ 10:42 AM
I've seen this scenario played out in a TV show.
It went terribly wrong...

Imagine the drones deciding to take out lawbreakers, deciding they're the enemies...



posted on Oct, 1 2012 @ 11:06 AM
reply to post by snowspirit
 


Or what if you are just having a bad day, everything started out bad, and it's getting worse? It might read your body language as a threat (like the Marthas did on Eureka) and decide to incapacitate you...



posted on Oct, 1 2012 @ 12:48 PM
reply to post by rangersdad
 


Maybe. We aren't the government, so we can't be sure. But facial recognition works by mapping patterns of dark and light zones; it's not a 3-D simulator. The drone would have to scan all three dimensions of the target's face very closely before deciding if it's a match. 2-D recognition means it can be fooled by still pictures and other things like that.
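A toy sketch of that 2-D vs. 3-D point, with made-up numbers: a matcher that only compares a flat light/dark pattern accepts a printed photo of the target, while adding even a crude depth check rejects the flat print.

```python
# Hypothetical sketch: 2-D matching compares only a flat brightness
# pattern, so a printed photo of the target can pass; requiring 3-D
# structure (depth) rejects the flat print. All numbers are invented.

def match_2d(pattern_a, pattern_b, tolerance=0.05):
    """Match two flat brightness patterns point by point."""
    return all(abs(a - b) <= tolerance for a, b in zip(pattern_a, pattern_b))

def depth_variation(depths):
    """Total relief of the scanned surface; a real face is far from flat."""
    return max(depths) - min(depths)

target_brightness = [0.2, 0.8, 0.5, 0.9]  # enrolled 2-D light/dark pattern

photo_brightness  = [0.2, 0.8, 0.5, 0.9]  # printed photo: same pattern...
photo_depth       = [0.0, 0.0, 0.0, 0.0]  # ...but completely flat

print(match_2d(target_brightness, photo_brightness))   # True: 2-D is fooled

# 3-D check: also require face-like relief before calling it a match
is_3d_match = (match_2d(target_brightness, photo_brightness)
               and depth_variation(photo_depth) > 0.5)
print(is_3d_match)                                     # False: flat print rejected
```

This is only a cartoon of the idea, but it shows why a matcher limited to 2-D patterns can be confused by still pictures while a close 3-D scan would not be.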



