posted on Mar, 5 2006 @ 01:58 AM
I was reading a bit about the current state of Artificial Intelligence, and I have to say that what struck me the most about it was the lack of
discussion concerning developing hardware that can experience pain and pleasure, reward and punishment. There's a lot of discussion about mimicking
human responses, but not a lot about why human beings (or any intelligent creatures) respond the way they do.
So I was trying to imagine some kind of mechanical system that doesn't use organic components (to give myself more of a challenge, I guess), one that could be used to get a machine to display and act on a preference. After all, even tiny little bacteria, with no brains at all, prefer some kinds of food over others, some kinds of environments over others, and so on. But how do you get a machine to express a preference? Without specifically
programming it to do so, how do you get a machine to say that its favorite color is green, for instance? It would have to be generated by some
component of the machine that takes the information it gets from the world around it, and assigns an aesthetic value to it. Green gives it more
pleasure, for some reason.
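Just to make the idea a little more concrete, here's a rough sketch in Python of what I imagine that valuation component doing. Everything in it is made up by me for illustration (the Valuator class, the 0.1 learning step, the pretend world that happens to pair green with slightly better outcomes); the point is only that nothing in the code says "like green", yet a favorite color falls out of the machine's own history.

import random

class Valuator:
    # hypothetical "valuation component": keeps a running aesthetic value per color
    def __init__(self, colors):
        # start indifferent: every color has the same value
        self.value = {c: 0.0 for c in colors}

    def experience(self, color, outcome):
        # nudge the stored value toward whatever outcome accompanied seeing that color
        self.value[color] += 0.1 * (outcome - self.value[color])

    def favorite(self):
        # the "preference" is just whichever color has accumulated the highest value
        return max(self.value, key=self.value.get)

v = Valuator(["red", "green", "blue"])
for _ in range(1000):
    color = random.choice(["red", "green", "blue"])
    # assumption for the sketch: the world happens to pair green with slightly better outcomes
    outcome = random.gauss(0.5 if color == "green" else 0.0, 1.0)
    v.experience(color, outcome)

print(v.favorite())   # usually "green", though nobody programmed that preference in

The favorite here is basically a running average of what happened to the machine in the presence of each color, which is about the simplest stand-in I can think of for "green gives it more pleasure, for some reason."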
It's not a software issue, I don't think. It has to be a mechanical issue: something that combines pleasure and pain signals but also keeps them separate in a way. Think of a child crying to get candy. The child experiences frustration and pain when they don't get the
candy they want, but they wouldn't experience the frustration if they couldn't imagine the pleasure they would get from the candy. See what I mean?
A dynamic interaction between the two states.
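I know I just said it's probably not a software problem, but here's a quick Python sketch of the interaction I'm picturing, just to pin the idea down. The names and the scale are mine (a hypothetical want_strength from 0 to 1, frustration defined as the gap between imagined and actual pleasure); the real thing might end up being a mechanical arrangement, but the relationship between the two signals would be the same.

def anticipated_pleasure(want_strength):
    # how good the machine expects the candy to be (hypothetical 0..1 scale)
    return want_strength

def frustration(want_strength, got_candy):
    expected = anticipated_pleasure(want_strength)
    actual = expected if got_candy else 0.0
    # pain signal = the gap between what was imagined and what actually happened
    return max(0.0, expected - actual)

print(frustration(0.9, got_candy=False))  # 0.9 -> strong frustration, the child cries
print(frustration(0.9, got_candy=True))   # 0.0 -> got the candy, no frustration
print(frustration(0.0, got_candy=False))  # 0.0 -> didn't want it, so no pain either

The key part is that the pain only exists because the pleasure was predicted first; take away the ability to imagine the candy and the frustration goes to zero.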
Until this basic problem is solved, I don't see AI progressing much beyond brute-force computing that emulates, but doesn't duplicate, human or animal intelligence.