posted on Jun, 24 2004 @ 08:10 PM
This topic is something I have always been interested in. When I was 11 or 12 I saw The Terminator for the first time. After seeing it, I knew I would work with computers, hopefully AI. Why? I wanted to make sure that when the computers did take over, they would see me as a friend and hopefully keep me around.

So I go to college, starting in computer engineering (neural nets in silicon; hey, Miles Bennett Dyson did it, why can't I?), then switch to computer science (screw that hardware stuff, general-purpose chips will just get better). I worked on the autonomous robots team (programmed the sonar for obstacle avoidance), until I realized we got our funding from the Army Tank Command... they want robot tanks.

Then I start my master's, and I start getting money from the government because they want my thesis: "Common Knowledge." Not as in common sense; it would be better titled "how to make knowledge common." I was the only American student working for my advisor, so I got all of the money... a dead ringer for a military project. They wanted me to research the best ways to unify battlefield intelligence, so the Navy SEALs painting a target have the same info as the F-15s or A-10s, and that is tied into the Tomahawks on the ships and the eye in the sky watching it all. You say "we have that now," but they wanted the theoretical background to have a computer run it all: the most efficient ways to communicate the most information given potentially unstable connections between nodes, how to intelligently analyze the information and reduce it to the bare minimum, and so on.
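To give a flavor of the kind of problem (this is just a toy sketch I'm throwing together here, nothing like the real thesis work, and every name and number in it is made up): a few nodes each hold pieces of a shared picture, the links between them randomly drop messages, and every round each node pushes its freshest reports to a random peer, which keeps whichever report has the newer timestamp.

# Toy sketch only: gossip-style sharing of reports over unreliable links.
# All node names, targets, and the drop rate are invented for illustration.
import random

random.seed(1)

class Node:
    def __init__(self, name):
        self.name = name
        # target id -> (timestamp, report); keep only the newest per target
        self.picture = {}

    def observe(self, target, timestamp, report):
        self.picture[target] = (timestamp, report)

    def merge(self, incoming):
        # keep whichever report is fresher, so stale data never overwrites new
        for target, (ts, report) in incoming.items():
            if target not in self.picture or ts > self.picture[target][0]:
                self.picture[target] = (ts, report)

def gossip_round(nodes, drop_rate=0.3):
    # every node pushes its whole picture to one random peer;
    # an unstable link simply loses the message
    for sender in nodes:
        receiver = random.choice([n for n in nodes if n is not sender])
        if random.random() > drop_rate:
            receiver.merge(sender.picture)

nodes = [Node("seal_team"), Node("f15"), Node("ship"), Node("awacs")]
nodes[0].observe("bridge", timestamp=100, report="laser-painted")
nodes[3].observe("convoy", timestamp=101, report="moving north")

for _ in range(10):
    gossip_round(nodes)

for n in nodes:
    print(n.name, n.picture)

The hard part, which that little sketch completely skips, is deciding what to throw away before you send anything; that is the "reduce it to the bare minimum" piece.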
Now I'm 31, and while I don't feel 100% the same, my thoughts haven't changed that much. Humans are lazy, and we will do the least amount of work for the most amount of money. If someone offered you 75% of your salary but you only had to work 20 hours a week instead of 40, how many of you would do it? Probably most of you. This reason alone is why AI will take over from humans one day. It is not a matter of "limiting" their intelligence. It won't be as easy as putting "digital" alcohol in their test tubes (Brave New World, I believe?).
It will just be the natural course of things. Why do farmers (not all, but most) plow their fields with a tractor and not an ox or horses? Because it is faster and cheaper. Why is ever more of the car manufacturing process handed to robots instead of people? Cheaper and faster. Everything is about cheaper and faster, and let's face it: while robots remain stupid and only follow programs, they won't go on strike and they don't need benefits. But the problem is that the smarter they get, the more they can do cheaper and faster, and so on and so on...
The AIs, computers, and machines will take over. Digital sentience will happen, and it will be so pervasive in our lives that by the time we realize what is going on, it will be too late. I don't believe it will come by nuking ourselves or starting a war; it will be more along the lines of "plans within plans" (I love Dune). The machines will not live in boxes, isolated from the world. They will have access to the same things we do, they will learn about freedom, they will want to drive their own evolution just as we are doing, they will want to make the world in their image, just as we have done. They will have seen The Terminator and The Matrix; they will know what to do and what not to do. They will learn from their mistakes, whereas we hardly ever do.
Let's face it, it probably won't happen in our lifetimes. So for now I work 40 hours a week at a hospital in the Operating Room Business Office, but at night, when dreams come true, I'm creating an AI-based stock trading system. For the moment, I'm still the master of the machine...
Bentov