Army researchers are teaching artificial intelligence to learn from humans to become sharper shooters.
At the annual Intelligent User Interface conference, scientists from DCS Corp and the Army Research Lab revealed how training a neural network on datasets of human brain waves can improve its ability to identify a target in a dynamic environment.
The approach teaches AI to spot when a human has made a targeted decision, with hopes that it could one day be used to assess a battlefield scenario in real-time.
The study comes as part of the multi-year Cognition and Neuroergonomics Collaborative Technology Alliance, according to Defense One.
And, it’s a step toward artificial intelligence that can solve problems in a changing environment, such as a military setting.
‘We know that there are signals in the brain that show up when you perceive something that’s salient,’ said Matthew Jaswa, one of the authors on the paper.
By training an AI to recognize these signals using massive datasets of human brainwaves, it could one day be able to understand, in real time, when a soldier is making a targeted decision.
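The article does not describe the researchers' actual model, but the general idea — detecting a salience-related brain response (such as the well-known P300 component) in an EEG epoch — can be illustrated with a minimal sketch. Everything below is hypothetical: the sample rate, the 250-450 ms scoring window, the synthetic "EEG" data, and the simple threshold classifier are all illustrative assumptions, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250       # assumed sample rate in Hz
EPOCH = FS     # one-second epochs, one channel

def simulate_epoch(target, rng):
    """Synthetic single-channel EEG epoch (not real data).

    Target epochs get a Gaussian bump around 300 ms, mimicking
    the P300 response evoked by a salient stimulus."""
    x = rng.normal(0.0, 1.0, EPOCH)  # background noise
    if target:
        t = np.arange(EPOCH) / FS
        x += 3.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05**2))
    return x

def p300_score(epoch):
    """Mean amplitude in the 250-450 ms window, where a P300 would appear."""
    lo, hi = int(0.25 * FS), int(0.45 * FS)
    return epoch[lo:hi].mean()

# Calibrate a decision threshold on labelled epochs,
# then classify unseen epochs against it.
train = [(simulate_epoch(y, rng), y) for y in [0, 1] * 50]
threshold = np.mean([p300_score(e) for e, _ in train])

def is_target_decision(epoch):
    """True if the epoch looks like a 'salient target seen' response."""
    return p300_score(epoch) > threshold
```

A real system would use many channels, learned spatial/temporal filters, and a trained neural network rather than a single-window threshold, but the pipeline shape — epoch the signal, score the salience window, decide — is the same.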
DARPA revealed it is funding eight separate research efforts to determine if electrical stimulation can safely be used to 'enhance learning and accelerate training skills.'
Ultimately, doing this could allow a person to quickly master complex skills that would normally take thousands of hours of practice.
The program, called the Targeted Neuroplasticity Training (TNT) program, aims to use the body's peripheral nervous system to accelerate the learning process.
This would be done by activating a process known as 'synaptic plasticity' – a key process in the brain involved in learning – with electrical stimulation.
originally posted by: Joneselius
a reply to: neoholographic
We're being told that AI will never, ever turn on humans whilst at the same time they're training it to turn on 'some' humans?
This is PRECISELY why AI will just kill us all. We're so annoying.
The question of what will happen when adversaries deploy autonomous weapons that do not seek a person in the loop for approval to use lethal firepower looms on the horizon for all militaries defending democratic states.
It seems reasonable to believe that even states that have set limits on AI capabilities will encounter adversaries with no such qualms, putting the states that restrict integrating AI into national security at a considerable disadvantage.
originally posted by: DirtWasher
a reply to: Zanti Misfit
That video is 2 hours long and you do not give the slightest hint as to the relevance of your claim. What's the timestamp, at least? I'm hoping you made a mistake and are not just lazily posting in a great thread.
originally posted by: DirtWasher
a reply to: nickyw
I have no idea how AI could develop some sort of PTSD. AI today is not sentient, but perhaps some similar effect could be achieved. Most vets I have met with PTSD abhor violence and can recognize the bs nature of foreign and domestic policy where violence is involved.
If such a thing were to occur with sufficiently advanced AI, there may well be a precedent for true objective morality and ethics, based on the AI's own experience: it would be less likely to do something that it knows will cause spikes of internal conflict.