posted on May, 23 2008 @ 04:56 PM
Regarding my previous post: I was in a bit of a "Rotten Mood" yesterday
due to my allergies, so those comments came across as rather "Braggy"
and arrogant - so my apologies for that!
But I did want to add some technical clarifications....
1) MOST general-purpose CPU's such as AMD/Intel/Via use something
called hardware interrupts in order to do multi-tasking on a
time-critical basis, which means that a processor can, within a
guaranteed 16 to 32 millisecond time-frame, switch over to a
new task, execute some instructions and then go back to the ORIGINAL
task.
2) Because most Pentium/AMD Athlon CPU's have up to 16 hardware-based
threads, 16 separate tasks can be run at the same time, and each task
is given its own memory and CPU slices without interfering with the
others -- this allows me to play a video, download a file, write a book
and browse the web without those tasks causing crashes, slowdowns or
harm to each other.
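Just to make that time-slicing idea concrete, here's a rough little C sketch -- plain POSIX on a desktop, nothing to do with my Cell boards, and the 16 ms tick, the on_timer handler and the run_task loop are purely my own illustration -- of how a periodic timer interrupt ends one task's slice so the next one can run:

/* Toy round-robin time slicing driven by a 16 ms timer interrupt.
 * Illustrative only; a real OS scheduler does this in the kernel. */
#include <signal.h>
#include <stdio.h>
#include <sys/time.h>

static volatile sig_atomic_t slice_expired = 0;

static void on_timer(int sig)
{
    (void)sig;
    slice_expired = 1;            /* tell the running task its slice is up */
}

static void run_task(int id)
{
    long work = 0;
    while (!slice_expired)        /* do work until the timer interrupt fires */
        work++;
    slice_expired = 0;
    printf("task %d yielded after %ld iterations\n", id, work);
}

int main(void)
{
    struct sigaction sa;
    sa.sa_handler = on_timer;
    sigemptyset(&sa.sa_mask);
    sa.sa_flags = 0;
    sigaction(SIGALRM, &sa, NULL);

    /* 16 ms periodic timer, similar to a classic scheduler tick */
    struct itimerval tv;
    tv.it_value.tv_sec = 0;
    tv.it_value.tv_usec = 16000;  /* first tick in 16 ms */
    tv.it_interval = tv.it_value; /* then every 16 ms after that */
    setitimer(ITIMER_REAL, &tv, NULL);

    for (int round = 0; round < 4; round++)   /* round-robin over 4 tasks */
        for (int id = 0; id < 4; id++)
            run_task(id);
    return 0;
}

The real scheduling lives inside the operating system of course; the point is simply that a timer interrupt is what ends one task's turn and hands the CPU to the next.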
3) The problem with the AMD/Intel CPU's is that 16 to 32 milliseconds of
interrupt time-slicing for thread execution is NOT a fast enough time
period for me to do mission-critical tasks such as flight control, avionics
or vision recognition at high video frame rates. At 1000 frames per second,
each frame only gives me 1 millisecond to work with, so a 16 millisecond
scheduling delay means 16 frames have already gone by before my code even
gets the CPU back.
4) What I have therefore done is use the multi-core architecture
of the IBM/Sony Cell Processor, which I've disembowelled from
cheaply purchased dead Playstation-3's (Don't Worry - I tested all the
CPU's themselves thoroughly!), to create a single-threaded
application architecture that is synchronized among multiple CPU's.
Incoming 1920 by 1080 pixel, 24-bit colour images are broken
down into sections that are "Gridded Out" to multiple CPU's.
Each CPU then uses my multi-processing techniques to do
edge detection using integer and fixed-point convolution
filters, followed by object and vision recognition using template-based
boolean logic, to find terrain features such as rivers, roads, houses,
apartments, powerlines, trees, people, cars, trucks etc. in real-time.
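For the technoids who want to see what the integer edge-detect stage looks like, here's a bare-bones C sketch. To be clear, this is NOT a dump of my actual Cell code (the real thing runs on the SPE's with DMA'd tile buffers); the Sobel kernels, the detect_edges_strip function and the row-strip split are just my own illustration of the integer convolution idea:

/* Integer (no floating point) Sobel edge detection over one strip of rows,
 * the way a single processor would handle its share of a 1920x1080 luma frame. */
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define FRAME_W 1920
#define FRAME_H 1080

/* 3x3 Sobel kernels, pure integer math */
static const int GX[3][3] = { {-1, 0, 1}, {-2, 0, 2}, {-1, 0, 1} };
static const int GY[3][3] = { {-1,-2,-1}, { 0, 0, 0}, { 1, 2, 1} };

/* Edge-detect rows [y0, y1); src/dst are full-frame 8-bit buffers. */
static void detect_edges_strip(const uint8_t *src, uint8_t *dst, int y0, int y1)
{
    for (int y = y0; y < y1; y++) {
        for (int x = 0; x < FRAME_W; x++) {
            if (y == 0 || y == FRAME_H - 1 || x == 0 || x == FRAME_W - 1) {
                dst[y * FRAME_W + x] = 0;        /* skip the frame border */
                continue;
            }
            int sx = 0, sy = 0;
            for (int ky = -1; ky <= 1; ky++)
                for (int kx = -1; kx <= 1; kx++) {
                    int p = src[(y + ky) * FRAME_W + (x + kx)];
                    sx += GX[ky + 1][kx + 1] * p;
                    sy += GY[ky + 1][kx + 1] * p;
                }
            int mag = abs(sx) + abs(sy);          /* cheap |G| approximation */
            dst[y * FRAME_W + x] = (uint8_t)(mag > 255 ? 255 : mag);
        }
    }
}

int main(void)
{
    enum { NPROC = 16 };
    static uint8_t src[FRAME_H * FRAME_W], dst[FRAME_H * FRAME_W];
    memset(src, 128, sizeof src);                 /* stand-in for a captured frame */

    int rows_per_proc = FRAME_H / NPROC;          /* 67 rows each, last strip takes the rest */
    for (int p = 0; p < NPROC; p++) {
        int y0 = p * rows_per_proc;
        int y1 = (p == NPROC - 1) ? FRAME_H : y0 + rows_per_proc;
        detect_edges_strip(src, dst, y0, y1);     /* one strip per processor */
    }
    return 0;
}

With 16 processors, each one takes a horizontal strip of roughly 1080 / 16 = 67 or 68 rows per frame, which is exactly what I mean by "Gridding Out" the image.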
Because I am using 16 CPU's along with the multi-core internal execution
units of each Cell processor, I can get absolutely FANTASTIC
digital signal processing performance at the rate of 1000 fully
edge-detected and Object-Recognized frames per second
at full HDTV resolution from a single forward-looking
high frame-rate HDTV camera.
The BANDWIDTH to do that is 1920 x 1080 pixels x 3 bytes x 1000 frames
= 6,220,800,000 bytes per second, or about 6.2 Gigabytes per second,
which is why I break the task down across 16 processors -- at roughly
388 megabytes per second per processor, each individual Cell processor
handles its share of the frame buffer easily.
Because I need 3 full frames as an edge-detect and motion-detection
buffer and 1 frame for an object database template buffer, my total
bandwidth is nearly 25 Gigabytes per second, BUT that is STILL
within the performance abilities of 16 IBM/Sony Cell processors
working as one big supercomputer.
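If anyone wants to sanity-check those bandwidth numbers, here's the back-of-the-envelope arithmetic written out as a few lines of C (the variable names are just for illustration):

/* Quick check of the bandwidth figures quoted above */
#include <stdio.h>

int main(void)
{
    const long long width = 1920, height = 1080, bytes_per_pixel = 3;
    const long long fps = 1000, processors = 16, frames_buffered = 4;

    long long per_second = width * height * bytes_per_pixel * fps;   /* one raw stream */
    printf("raw stream:    %lld bytes/s (~%.1f GB/s)\n",
           per_second, per_second / 1e9);
    printf("per processor: %lld bytes/s (~%.1f MB/s)\n",
           per_second / processors, per_second / processors / 1e6);
    printf("with %lld frame buffers: ~%.1f GB/s total\n",
           frames_buffered, frames_buffered * per_second / 1e9);
    return 0;
}

It prints about 6.2 GB/s for the raw stream, roughly 388.8 megabytes per second per processor, and about 24.9 GB/s once the extra frame buffers are counted.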
You should see the small turbine generator I had to build just to POWER
the Cell-based motherboard while in flight....!!!!!!
5) And for those technoids out there, I've been able to shoehorn the
thing into a kit that is basically the size of a 1/5th-scale Bell Jet Ranger III,
which makes it a VERY LARGE autonomous chopper.
I'm using Rotax engines and they're NOT light, so this is no
cheap 60-Series chopper.
Back to my intentions: I'm attempting to build BOTH a standard Jet-Ranger
type camera chopper AND a circular Avrocar-style ducted-fan,
UFO-like UAV camera platform that will EVENTUALLY be able
to go to heights above 80,000 feet and use VERY high-end
custom lenses to take gyroscopically-stabilized, high-resolution
22-megapixel still photos from that altitude. (Hasselblad Digital Back)
I'm not quite where I want to be yet....BUT....
my flight control software DOES work magnificently at those
1000 FPS frame rates, so I am confident that a fully
autonomous ground-hugging and high-flying UAV can be built
on a civilian hobby budget of $100,000!!!