reply to post by RichardPrice
Bus and socket technologies have changed hugely in computers over the past 15 years, with significant speed increases and improved error handling
along the way.
The front-end has changed considerably, sure. More has changed with the substrates between layers of board than with the physical buses that
front-end plugs into.
And that is not even counting fiber optics - which has largely replaced the old cannon-plug style avionics in data-centric airframes.
A hardware DSP will always, always *always* outperform a distributed architecture for the same power, cooling and space requirements. Always.
Simply, no.
There are two basic computational tasks. Serial and parallel. Every task can be expressed in both a serial and parallel form, but some are more
inherently one way or another (such as data compression being largely serial but raster rendering being ridiculously parallel).
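To make the distinction concrete, here is a toy sketch (the functions and numbers are mine, purely illustrative): per-pixel shading can be farmed out to a worker pool because no pixel depends on another, while a delta-style encoder is stuck running in order because each step depends on the previous value.

```python
# Toy illustration: raster work is "embarrassingly parallel" -- every
# pixel is independent -- while delta encoding is inherently serial.
from multiprocessing.dummy import Pool  # thread pool; enough for a sketch

def shade(pixel):
    # independent per-pixel work: execution order does not matter
    return min(255, pixel * 2)

def delta_encode(data):
    # serial: each output depends on the previous input value
    out, prev = [], 0
    for b in data:
        out.append((b - prev) % 256)
        prev = b
    return out

pixels = [10, 200, 45, 90]
with Pool(4) as p:
    shaded = p.map(shade, pixels)  # parallelizable across workers
encoded = delta_encode(pixels)     # must run strictly in order

print(shaded)   # [20, 255, 90, 180]
print(encoded)  # [10, 190, 101, 45]
```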
The floating-point performance of a cheap, entry-level graphics processor runs circles around top-of-the-line x86-64 processing cores. You can run
physical simulations of the hardware-specific design on the graphics card that complete just as fast as the DSP itself operates - with comparable
power requirements.
Certain specialized components - such as sensors, yes, will likely be custom designed (to interact with common bus interfaces, actually; much like
your camera has a USB standard interface).
The only reason we do not use DSPs in everything is because they are essentially single-task oriented, while most supercomputers are designed to
be used for multiple tasks.
You are talking about apples and oranges, here. The applications you are talking about are ancient. Back when you had the amplified radar return
physically driving the electron beam on a CRT radar scope. That went bye-bye with the AWG-9, centered on the 8080 (or 8086) processor.
It's an ancient design philosophy, even. Even in the flight control avionics - (fly-by-wire) - the pilot is never directly in control of the
aircraft. That is all run by computer. The computer takes the pilot's input and decides how best to contort the surfaces under the current and
projected conditions to get the desired outcome.
Electronics are no longer the monkey in the middle. They are the crew that interprets and applies the orders of the captain.
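As a rough illustration of that "interpret and apply" loop (the gains, limits, and names below are invented for the sketch, not any real control law), a fly-by-wire computer might map stick position to a desired pitch rate and command a limited surface deflection, rather than letting the stick drive the surface directly:

```python
# Hedged sketch of a fly-by-wire control step. All constants are
# illustrative assumptions, not real flight-control-law values.
def elevator_command(stick, sensed_pitch_rate,
                     max_rate=30.0, gain=0.8, max_deflection=25.0):
    desired_rate = stick * max_rate           # stick in [-1, 1] -> deg/s
    error = desired_rate - sensed_pitch_rate  # what the airframe is not yet doing
    deflection = gain * error                 # simple proportional law for the sketch
    # envelope protection: never command past the structural deflection limit
    return max(-max_deflection, min(max_deflection, deflection))

print(elevator_command(0.5, 5.0))    # modest correction: 8.0 degrees
print(elevator_command(1.0, -40.0))  # large error, clamped to 25.0
```

The point of the clamp is the post's point: the computer, not the pilot, decides how far the surface actually moves.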
But a modern aircraft is full of tasks that can be happily processed by single-task chips - there is extremely little general purpose computing
involved in a modern aircraft's systems. Even the tasks you highlight can be handled much better by a hardware DSP than a general-purpose computing
architecture - they are well-defined problems with well-defined solutions: you put X in and want Y out. You will always put X in and want Y out.
Therefore you do not use a general-purpose computing architecture; you use specific hardware to do that.
Again, no.
The airframe is subject to thousands of different forces, monitored by several hundred different sensor arrays distributed throughout the structure.
There are sensors on the airframe that sense the load on the wing - artificially limiting the aircraft's turning performance to keep it from buckling
under the G-forces of maneuvering. That has to be factored in with input from the pitot tubes - say, an atmospheric anomaly detected at the nose
that will hit the port intake in three milliseconds.
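A minimal sketch of that artificial limit (the thresholds are invented for illustration, not real airframe numbers): the computer clamps the pilot's commanded g as the measured wing load eats into the structural margin.

```python
# Illustrative g-limiter. Numbers are assumptions for the sketch only.
def limited_g(commanded_g, wing_load_fraction, hard_limit=9.0):
    # wing_load_fraction: 0.0 = unloaded wing, 1.0 = at structural limit.
    # Shrink the allowable g as the measured load approaches that limit.
    allowed = hard_limit * max(0.0, 1.0 - wing_load_fraction)
    return min(commanded_g, allowed)

print(limited_g(8.0, 0.5))  # wing already half-loaded: pilot gets only 4.5 g
print(limited_g(3.0, 0.5))  # gentle request passes through unchanged: 3.0
```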
You're not just processing a signal. You're running gigabytes of physical-environment simulation while monitoring (and contributing to) the
combat network, monitoring your own active/passive arrays, and presenting a clean, intuitive interface for the pilot to interact with.
You're talking about the hardware peripherals - the temperature sensors that run thousands of samples per second and process them for a slightly
slower standard interface; the pitot tubes feeding the environment simulation, etc. Yes, those are application specific.
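For instance, a sketch of that kind of peripheral-side processing (the sample rate, block size, and values are made up for illustration): average blocks of fast raw sensor samples down to the rate a standard bus interface expects.

```python
# Illustrative decimation: a peripheral sampling at a few kHz averages
# each block of raw samples into one word for a slower standard bus.
def decimate(samples, factor):
    # average each block of `factor` raw samples into one output value
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples), factor)]

raw = [20.0, 22.0, 21.0, 23.0, 40.0, 42.0, 41.0, 43.0]
print(decimate(raw, 4))  # [21.5, 41.5] -- two bus words from eight samples
```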
However, when we're talking avionics - we're usually talking about the central core the pilot interacts with. 50 years ago, yes, that used to be a
CRT driven by analog (and later, some digital) processors. The hardware was completely passive and required the pilot for everything.
Now, the avionics are interactive. They classify radar contacts (processing even the smallest of returns), automatically counter jamming attempts,
and so on. They do more than a whole room full of RIOs could, and they feed that information to the pilot (and the flight officer, if present).
It sounds a bit excessive, until you see some of the results. Information processing and distribution is the new thing in warfare. "Black" super-x
projects have largely been replaced by the development of combat-awareness networks.