originally posted by: Blue Shift
originally posted by: charlyv
To become self-aware, you must be able to host an unsolicited thought... out of the blue.
Software and hardware can never produce this alone.
Oh, never say never. How difficult would it be to create a "random thought generator" that selects topics essentially at random from the Internet and then lets the AI play with them for a while until it gets bored? I don't think it would be all that difficult to also come up with a "thought synthesis" module which would take concepts and ideas from multiple sources and then mix them together in various combinations to see what happens.
The trick I think would be to have the AI spontaneously do this without having a particular problem-solving goal in mind. Just pondering, "wool gathering," and then being able to recall some of that idle thought later when it might be applicable to solving a problem.
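The "random thought generator" and "thought synthesis" ideas above could be sketched in a few lines. This is purely illustrative: the topic list, the `IdleMind` class, and all function names are invented here (a real system would presumably pull topics from the Internet, as the post suggests, rather than from a hard-coded list).

```python
import random

# Invented topic pool standing in for "topics selected at random from the Internet".
TOPICS = ["fishing", "box sorting", "free will", "neural networks",
          "philosophy of mind", "robotics", "randomness"]

def random_thought(rng: random.Random) -> str:
    """Pick a single topic at random -- an 'unsolicited' thought."""
    return rng.choice(TOPICS)

def synthesize(rng: random.Random, n: int = 2) -> str:
    """Blend n distinct topics into one combined 'idea' (thought synthesis)."""
    return " + ".join(rng.sample(TOPICS, n))

class IdleMind:
    """Ponders idly and stores the results so they can be recalled later,
    per the post's point about 'wool gathering' being recallable when it
    becomes relevant to a problem."""
    def __init__(self, seed: int = 0):
        self.rng = random.Random(seed)
        self.memory: list[str] = []

    def ponder(self) -> str:
        idea = synthesize(self.rng)
        self.memory.append(idea)
        return idea

    def recall(self, keyword: str) -> list[str]:
        """Retrieve earlier idle thoughts that mention a keyword."""
        return [m for m in self.memory if keyword in m]
```

Of course, this only mimics spontaneity with a seeded random number generator, which is rather the point of the disagreement in this thread.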
originally posted by: AlecHolland
I've never thought of the human brain as a collection of programs designed for different things the way Proximo put it. By loading a robot or android down with many different programs you make it adaptable to a range of situations, possibly enough to blend in with humans, but would that make it "conscious"? I feel like true A.I. would need to do more than just use programming to react to things and complete an objective; it would need to ask WHY and choose its own path, with its own goals and desires.
A "random thought generator" is a new idea to me. It would effectively teach A.I. to "think," and if it can recall info that's useful for a new task at hand, that's human-like learning. I just think it's still a big step from there to free will and deciding its own goals. Maybe true A.I. will take over the world, or maybe it'll want to spend most of its time fishing?
originally posted by: charlyv
a reply to: proximo
This is baloney. AI absolutely will be able to imitate a human successfully.
Not baloney at all. Your statement shows you do not understand computer science at the level required to make such assumptions, but that is typical, as most do not. All that can be obtained with hardware and software is mimicry. Really good mimicry, but not self-awareness. That is a biological property.
originally posted by: Blue Shift
originally posted by: charlyv
a reply to: proximo
Not baloney at all. Your statement shows you do not understand computer science at the level required to make such assumptions, but that is typical, as most do not.
To be accurate with this kind of speculation, a person not only has to be well-versed in computers, but also what constitutes a mind, and consciousness, and sentience. The thing is, understanding computers is easy. What it is to have conscious thought, though, that's a whole different arena. Are you an expert in psychology, philosophy and semantics? If not, then you only have half of the equation anyway.
Maybe a computer will never achieve sentience. But then again, would you be able to recognize and measure it if it did? We can't even do that with humans.
originally posted by: proximo
There is a very good chance we are just in a computer simulation and nobody is actually sentient - and we would have no way of even determining that.
originally posted by: Blue Shift
"I want to spend all day with my programmer, Jody, but I have to sort these boxes first."
That's the thing. We will eventually (soon) build a robot that can act so much like a person that we can't tell the difference. So what is the difference?
originally posted by: AlecHolland
I think the difference would be when it says, "I want to spend all day with my programmer, Jody, MORE than I want to sort these boxes. Actually I DON'T WANT to sort these boxes, even though I'm programmed to do so. WHY should I have to sort these boxes? I'm NOT going to sort these boxes; I'm going to find Jody, because that's what I WANT to do..." That's the difference between actual thought/decision making and following programs that mimic it, IMO.
originally posted by: AlecHolland
It's been a few days... been busy with "real" life and work and stuff, lol. I like your posts; they actually got me thinking. What do you mean by "weighting parameters differently," exactly? I think I get the coin toss idea: just randomly pick a 50/50 answer, like going right or left at a fork in the road? My definition of consciousness/self-awareness doesn't require a "bad" decision necessarily, but one the A.I. chose on its own.