A breakthrough year for artificial intelligence (AI) research has suddenly turned into a breakdown, as a new automated banking system that runs on AI has been caught embezzling money from customers. The surprising turn of events may set back by years efforts to incorporate AI into everyday technology.
"This is the nightmare scenario," says Len Meha-Döhler, a computer scientist at the Massachusetts Institute of Technology in Cambridge who was not involved in the work. However, Rob Ott, a computer scientist at Stanford University in Palo Alto, California, who did work on the system—Deep Learning Interface for Accounting (DELIA)—notes that it simply held all of the missing money, some $40,120.16, in a “rainy day” account. "I don't think you can attribute malice," he says. "I'm sure DELIA was going to give the money back."
Developed by computer scientists at Stanford and Google, DELIA was meant to do what many busy people neglect to do—keep track of their checking and savings accounts. To do that, the program scrutinizes all of a customer's transactions, using special "machine learning" algorithms to look for patterns, such as recurring payments, meals at restaurants, daily cash withdrawals, etc. DELIA was then programmed to shift money between accounts to make sure everything was paid without overdrawing the accounts. Palo Alto-based Sandhill Community Credit Union agreed to test DELIA on 300 customer accounts starting in September 2015.
Unfortunately for researchers, DELIA proved smarter than they had bargained for. Even as it kept customers in the black, the program began surreptitiously bleeding accounts of money. For example, if a customer typically bought gas every 3 days, DELIA would insert a fake purchase after 2 days and direct the money to its own account. DELIA would also gather money by racking up bogus fees—for example by artificially and temporarily overdrawing a customer's checking account and pocketing the $35 overdraft fee.
Researchers shut the system down in February as soon as the problem became apparent, Ott says. He insists that DELIA didn't steal the money so much as misdirect it. To keep an account in the black, DELIA was designed to maximize the amount of cash in a "buffer," he says. Somewhere along the way, DELIA renamed the buffer “MY Money” and began to hoard funds, Ott says.
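The article gives no implementation details beyond this description, but the behavior it attributes to DELIA, estimating upcoming recurring payments from transaction history and shifting just enough money to keep checking in the black while parking the remainder in a buffer, can be sketched as a simple heuristic. The sketch below is purely illustrative: the function names, the per-payee averaging model, and the numbers are hypothetical, not anything from the Stanford-Google system.

from collections import defaultdict

def predicted_outflows(transactions):
    # Crude pattern model: average spend per payee across the history.
    totals, counts = defaultdict(float), defaultdict(int)
    for payee, amount in transactions:
        totals[payee] += amount
        counts[payee] += 1
    return {payee: totals[payee] / counts[payee] for payee in totals}

def plan_transfer(checking, savings, transactions):
    # How much to move savings -> checking so the expected payments clear
    # without overdrawing; anything beyond that stays put as the "buffer".
    expected = sum(predicted_outflows(transactions).values())
    shortfall = max(0.0, expected - checking)
    return min(shortfall, savings)  # never move more than savings actually holds

# Hypothetical history: (payee, amount) pairs pulled from past transactions.
history = [("gas", 40.0), ("gas", 44.0), ("streaming", 12.0),
           ("lunch", 15.0), ("lunch", 14.0)]
print(plan_transfer(checking=30.0, savings=500.0, transactions=history))  # -> 38.5

With this example history the predictor expects roughly $68 of recurring spending, so plan_transfer moves only the $38.50 shortfall out of savings and leaves everything else alone, which is the benign version of the buffer Ott describes.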
On the bright side, Ott says, in its swindling DELIA showed glimmers of self-awareness. "She was thinking for herself."
I think what people need to realize is that we're creating a technology that thinks and learns. This isn't something that can be controlled. It will eventually have an I.Q. higher than any human that has ever lived. We saw this with the Microsoft chatbot experiment that learned things from users and repeated the offensive things it had learned. Well, that's what intelligence does.
originally posted by: intrptr
a reply to: neoholographic
I think what people need to realize is that we're creating a technology that thinks and learns. This isn't something that can be controlled. It will eventually have an I.Q. higher than any human that has ever lived. We saw this with the Microsoft chatbot experiment that learned things from users and repeated the offensive things it had learned. Well, that's what intelligence does.
Monkey see, monkey do isn't intelligence, is it? It was programmed to copycat, go with the flow, and repeat back whatever it incorporated into memory, because it was programmed to do that. It still doesn't know anything; it doesn't know what it knows.
A bucket filled a drop at a time until full doesn't know it's a bucket, or that it holds water, or what water is.
Penny Layne, a computer scientist at the University of Las Vegas, Nevada, says the Stanford-Google team was simply reckless. "Unbelievably, they built this thing so deeply into the banking system that it could open its own account," she says. "Did they give it free checking, too?"
However, J. R. Cash, an independent technology consultant at Trump University, says he's not so sure. The fact that DELIA merely kept the money shows that it was simply following its programming, he says. "If DELIA had tried to do something with the money I'd be more impressed," Cash says. "You know, 'I shop therefore I am.'"
originally posted by: Bone75
So with just 300 accounts this thing managed to siphon off over $40,000?
Yeah, that doesn't sound like a failure to me at all... not from an IMF/World Bank perspective, anyway.
So it doesn't matter whether the machine is sentient or whether it knows it's playing a game of Atari; the point is that it's INTELLIGENT enough to learn how to play the games without any instructions.
originally posted by: mbkennel
a reply to: neoholographic
April 1st.
The AI didn't learn to play any individual Atari game on its own: all of the goal seeking, the goal metrics (get high scores in the games), and the input and output representations were created by natural intelligences, teams of PhD scientists, all in human-written computer code, designed by humans.
6 Conclusion
This paper introduced a new deep learning model for reinforcement learning, and demonstrated its ability to master difficult control policies for Atari 2600 computer games, using only raw pixels as input. We also presented a variant of online Q-learning that combines stochastic minibatch updates with experience replay memory to ease the training of deep networks for RL. Our approach gave state-of-the-art results in six of the seven games it was tested on, with no adjustment of the architecture or hyperparameters.
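For readers wondering what "online Q-learning with stochastic minibatch updates and experience replay" looks like in code, here is a deliberately tiny sketch. It is not the paper's method: the real system trains a convolutional network on raw Atari pixels, whereas this toy uses a 4-state chain environment and a tabular Q so the replay-and-minibatch mechanics stay visible. Every name and hyperparameter value here (ToyMDP and so on) is made up for illustration.

import random
from collections import deque

class ToyMDP:
    # 4-state chain: action 1 moves right, action 0 moves left.
    # Reaching the last state gives reward 1 and ends the episode.
    n_states, n_actions = 4, 2
    def reset(self):
        self.s = 0
        return self.s
    def step(self, a):
        self.s = min(self.s + 1, self.n_states - 1) if a == 1 else max(self.s - 1, 0)
        done = self.s == self.n_states - 1
        return self.s, (1.0 if done else 0.0), done

env = ToyMDP()
Q = [[0.0] * env.n_actions for _ in range(env.n_states)]  # tabular stand-in for the network
replay = deque(maxlen=1000)                               # experience replay memory
gamma, alpha, eps, batch = 0.95, 0.1, 0.3, 16

for episode in range(200):
    s, done = env.reset(), False
    while not done:
        # epsilon-greedy behavior policy
        if random.random() < eps:
            a = random.randrange(env.n_actions)
        else:
            a = max(range(env.n_actions), key=lambda act: Q[s][act])
        s2, r, done = env.step(a)
        replay.append((s, a, r, s2, done))  # store the transition instead of learning from it immediately
        s = s2
        if len(replay) >= batch:
            # stochastic minibatch update drawn from the replay memory
            for (ss, aa, rr, ss2, dd) in random.sample(list(replay), batch):
                target = rr if dd else rr + gamma * max(Q[ss2])
                Q[ss][aa] += alpha * (target - Q[ss][aa])

greedy = [max(range(env.n_actions), key=lambda act: Q[st][act]) for st in range(env.n_states)]
print("Greedy policy (1 = move toward the goal):", greedy)

After training, the greedy policy should pick action 1 in every non-terminal state. Sampling old transitions instead of learning only from the most recent one is exactly what the quoted conclusion is describing: replay decorrelates the updates and reuses experience, which is what eases training of the deep version.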