
Neural networking/AI for B-21


posted on Sep, 20 2019 @ 12:00 PM
mobile.twitter.com...





B-21 scoop: The Air Force is going to demonstrate some form of artificial deep learning in the software code running on the OFP for the B-21. Here's the transcript of my interview with Dr. Roper, SAF/AQ

...

Lots to unpack there. But suffice it to say that the traditional approach to aircraft certification isn't kind to a deep learning approach to software. It's required that every software line of code is validated and verified on the ground first. So this will be interesting.


So according to this Trimble exchange with Dr. Roper, the OFP of the FCS will be constantly rewriting itself in flight. Which, as Trimble mildly notes, would be a certification nightmare, among other things...

This is particularly disappointing from the standpoint of their promotion of the Raider as a low-risk, mature program. Though this has been tossed about for over a decade, I'm not sure that's the direction I'd go on a program that absolutely has to go right and on schedule.



Some LoFlyte background info can be found here and here:


The LoFLYTE™ Neural Network Flight Control System is an important advance in aerospace technology because of the adaptive nature of the control system. The controller is designed to learn as it flies, so the control system, not the pilot, determines the most effective commands to give the plane for a particular situation.

During normal flight, the neural controller will use the data it receives from the telemetry system to compute the most efficient flight characteristics and adjust the control surfaces accordingly. However, where the neural control system has an enormous advantage over traditional control systems is during abnormal and unexpected flight conditions. For example, if the control system determines that the rudder is not responding, it will adjust quickly to control the aircraft using the remaining flight surfaces. Neural network control is necessary in hypersonic vehicles where the center of gravity of the vehicle can change significantly throughout the flight. The neural network can adjust to changing flight conditions faster than a human pilot, greatly enhancing the safety of the aircraft.
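To make the adaptive idea above a bit more concrete, here is a rough toy sketch. This is not the actual LoFLYTE code; the surface names, the gains, and the simple LMS-style update are all assumptions for illustration. The idea is a controller that keeps online estimates of how much each control surface actually contributes, and shifts commands away from a surface that stops responding.

import random

SURFACES = ["left_elevon", "right_elevon", "rudder"]

class AdaptiveAllocator:
    def __init__(self):
        # start out assuming every surface is fully effective
        self.eff = {s: 1.0 for s in SURFACES}
        self.lr = 0.2  # learning rate for the LMS-style online update

    def command(self, desired_rate):
        # split the demanded rate across surfaces in proportion to how
        # effective we currently believe each one is, plus a small dither
        # so the estimator can tell the surfaces apart
        total = sum(self.eff.values()) or 1e-6
        return {s: desired_rate * self.eff[s] / total + random.gauss(0.0, 0.3)
                for s in SURFACES}

    def update(self, deflections, observed_rate):
        # nudge each effectiveness estimate to shrink the gap between the
        # response we predicted and the response we actually observed
        predicted = sum(self.eff[s] * deflections[s] for s in SURFACES)
        error = observed_rate - predicted
        for s in SURFACES:
            self.eff[s] = max(0.0, self.eff[s] + self.lr * error * deflections[s])

def plant_response(deflections, failed):
    # the "real" aircraft: a failed surface contributes nothing to the rate
    return (sum(0.0 if s in failed else d for s, d in deflections.items())
            + random.gauss(0.0, 0.01))

allocator, failed = AdaptiveAllocator(), set()
for step in range(400):
    if step == 100:
        failed.add("rudder")  # the rudder quietly stops responding
    cmds = allocator.command(desired_rate=1.0)
    allocator.update(cmds, plant_response(cmds, failed))

print({s: round(v, 2) for s, v in allocator.eff.items()})
# after the failure the rudder's estimated effectiveness decays toward zero,
# so later commands lean on the two elevons instead

The point is only the loop: command, observe the response, update the effectiveness estimates, re-allocate. A real neural flight controller is doing the same thing with a far richer model and real telemetry.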



posted on Sep, 20 2019 @ 12:24 PM
a reply to: RadioRobert

I'm not sure what you would like to discuss.

I would like to say that, in my opinion, we "as the people" should rapidly decide what we want AI to NOT do.

Regulations cannot keep up, but we should set up some fundamental boundaries.

How? Can we? Do we need a defense? Is there?



posted on Sep, 20 2019 @ 12:29 PM
a reply to: EartOccupant

You can start with Asimov's "Three Laws of Robotics" (which actually number four).

It's a good place to start, but if implemented as-is, no warbots.



posted on Sep, 20 2019 @ 12:34 PM
a reply to: thedigirati

I agree with those laws.

Unfortunately, we both know, as you pointed out, that the world of warCraft will not obey those laws.



posted on Sep, 20 2019 @ 12:35 PM
a reply to: RadioRobert

Blame the optionally unmanned requirement.



posted on Sep, 20 2019 @ 12:41 PM
a reply to: EartOccupant

My concern, as I noted, was more about the impact this has on the Dem/Val and certification process, and the somewhat aggressive timeline.

Can we make this work? Yes, it's been done before, and the problems are more or less the same. Will this increase the likelihood of delays and cost overruns? Yes, almost certainly.



posted on Sep, 20 2019 @ 12:46 PM

originally posted by: mightmight
a reply to: RadioRobert

Blame the optionally unmanned requirement.



You don't need a neural network in the OFP for unmanned aircraft.

Neural networking in other areas would 100% help you achieve a better UCAV, though.


But within the operational flight program for your FCS? No.



posted on Sep, 20 2019 @ 01:10 PM

originally posted by: RadioRobert
You don't need a neural network in the OFP for unmanned aircraft.

Neural networking in other areas would 100% help you achieve a better UCAV, though.

But within the operational flight program for your FCS? No.

But it isn't just a UCAV phoning home. It's an optionally manned platform that is supposed to be capable of performing complex strategic strike missions autonomously. To achieve that it'll require a very robust neural networking architecture anyway. I don't see why you'd build it around OFP subsystems.

Hence why that BS unmanned option is the issue. Either build a UAV or don't. There's nothing gained by putting a remote capability on a manned platform.



posted on Sep, 20 2019 @ 01:15 PM
Fixing a sudden problem mid-flight might take the pilot a lot of time, especially at high speed over enemy lines, whereas the AI could probably try a hundred different adjustments a second, see which ones work, and keep refining until things are as good as possible.

It probably won't be as fancy as it sounds once it has to fit into a limited environment. It will be more like a set of levers, each of which gives a result: the system can quickly work out which settings give the best results, and the result of the first try determines which levers get moved up or down. I doubt there's much use in remembering faults and settings, as they will probably be unique to the event. The code should also be pretty easy to validate if it's written in Ada, since it's not the easiest language to muck things up with and the compilers are pretty strict.
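As a very loose illustration of that "set of levers" idea (the lever names, the quality score, and the step sizes below are all made up; a real FCS would be scoring live sensor feedback, not a formula), a nudge-and-keep loop looks roughly like this:

import random

LEVERS = {"elevon_trim": 0.0, "rudder_trim": 0.0, "throttle_bias": 0.0}
TARGET = {"elevon_trim": 0.4, "rudder_trim": -0.2, "throttle_bias": 0.1}  # unknown to the loop

def flight_quality(settings):
    # stand-in for "how well is the aircraft flying right now"
    return -sum((settings[k] - TARGET[k]) ** 2 for k in settings)

best = dict(LEVERS)
best_score = flight_quality(best)
for _ in range(1000):                              # "a hundred adjustments a second"
    lever = random.choice(list(best))
    trial = dict(best)
    trial[lever] += random.uniform(-0.05, 0.05)    # nudge one lever up or down
    score = flight_quality(trial)
    if score > best_score:                         # keep the nudge only if it helped
        best, best_score = trial, score

print({k: round(v, 2) for k, v in best.items()}, round(best_score, 4))

Only the nudges that actually improved the measured result survive, which is the "learn by trying" behaviour described above.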



posted on Sep, 20 2019 @ 01:51 PM
AI will surpass us eventually. And it will learn like we do.





posted on Sep, 20 2019 @ 03:22 PM
a reply to: Maxatoria

NASA was messing with a net in the 90s that could learn how the aircraft was supposed to fly, and could compensate for damage and systems failures. They flew the first round of tests, and it went dark.



posted on Sep, 20 2019 @ 04:25 PM
Sounds like something fit for the Loyal Wingman program, or for something to send out remotely that might skip across the upper atmosphere. I can't help but think of the stories from Roswell about how the alleged craft was operated neurally by the pilot.



posted on Sep, 20 2019 @ 04:26 PM
a reply to: Zaphod58

I recall reading years ago about a fully coordinated attack two X-45s performed on a ground target, where the two vehicles autonomously worked out how best to coordinate the attack and then executed it, again communicating only among THEMSELVES. Then, poof, nothing after.

We have huge gaps in drones, and there is a lot of stuff that has gone dark. I really think you, me, and Sam got something out there that night.



posted on Sep, 20 2019 @ 05:12 PM
a reply to: FredT

Yeah, they just did that again with Loyal Wingman. The controlling pilot designated the target, and the wingmen decided which one would attack, which weapons to use, and the tactics, all without intervention.



posted on Sep, 20 2019 @ 05:19 PM
a reply to: Zaphod58

And Kratos has a whole bunch of contracts to integrate various weapons on the Valkyrie. Including being able to carry two AAMs. heh. heh. heh.



posted on Sep, 20 2019 @ 06:10 PM
a reply to: anzha

And Kratos is building the Gremlins as the main sub. They are busy, and a lot of the neural network talk dovetails into all of these.


This specific application still seems unnecessarily ambitious as we go. There is no need to continuously rewrite your OFP "flight laws" after every flight. It would also mean two different birds will have two different OFP laws and so will fly (probably/hopefully unnoticeably) differently. And it's one more thing to go wrong or to get ironed out before IOC. Not sure I'm a fan, even if in theory it results in an ever-so-slight improvement.



posted on Sep, 20 2019 @ 06:39 PM
Don't forget the tests DARPA did with thought-control tech in the F-35...



posted on Sep, 21 2019 @ 05:41 AM
Stop stressing about it and let it roll



posted on Sep, 21 2019 @ 07:45 AM
Just a thought here,
An FCS that constantly rewrites itself would be extremely difficult, if not impossible, to hack. No two aircraft being exactly the same means every aircraft would have to be attacked individually. I don't think this is as silly a requirement as it sounds for a first-strike weapon, especially if, as Zaph pointed out, they have been working on something like this for over 25 years.



posted on Sep, 26 2019 @ 04:30 PM
There has been a lot of news as far as neural networks go; DARPA even did a TED talk showing how they believe they have simulated a rat's brain running on just a few server racks.

Neural networks learn by making mistakes, so I would assume they are just training the AI on the ground with supercomputers running millions, if not more, of simulated flights.

The faster the computer, the more flights can be simulated.


Imagine a human with millions upon millions of flight hours under every condition conceivable. If the aircraft could maintain a high-speed link to a supercomputer on the ground or in space, running millions of simulations of the mission it is on, it would make a great 'pilot' and would drop the cost per airframe, because the supercomputer stays on the ground or in space somewhere.
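As a rough sketch of what "training on the ground across huge numbers of simulated flights" can look like (the toy simulator, the two-parameter "policy", and the reward below are invented stand-ins; only the loop structure is the point):

import random

def simulate_flight(policy, condition):
    # toy "flight": the ideal command for this condition is the condition
    # itself, so reward is highest when gain*condition + bias lands on it
    command = policy[0] * condition + policy[1]
    return -(command - condition) ** 2           # higher (less negative) is better

def average_reward(policy, n_flights):
    # score a policy across a batch of randomly drawn flight conditions
    return sum(simulate_flight(policy, random.uniform(-1.0, 1.0))
               for _ in range(n_flights)) / n_flights

policy = [0.0, 0.0]                              # [gain, bias]: two-parameter stand-in for a network
best = average_reward(policy, 500)
for _ in range(2000):                            # the faster the computer, the more of these you can run
    candidate = [p + random.gauss(0.0, 0.05) for p in policy]
    score = average_reward(candidate, 500)       # each evaluation is a fresh batch of simulated flights
    if score > best:
        policy, best = candidate, score

print([round(p, 3) for p in policy], round(best, 5))   # drifts toward gain ~1, bias ~0

Every candidate change to the policy is scored across a fresh batch of simulated flights and only kept if it does better, so the compute budget translates directly into how many "flights" the system has effectively flown.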







 