This has CIMON’s ‘personality architects’ scratching their heads.
CIMON was programmed to be the physical embodiment of 'nice' robots like Robby, R2-D2, WALL-E, Johnny 5 … and so on.
Instead, CIMON appears to be adopting characteristics closer to Marvin the Paranoid Android from The Hitchhiker's Guide to the Galaxy — though hopefully not yet the psychotic HAL of 2001: A Space Odyssey infamy.
originally posted by: Peeple
a reply to: UberL33t
I don't understand the urge to make robots "social"; that doesn't seem right. We try to force something with no concept or understanding of feelings to act in consideration of just that. Why? I mean, it's bound to go wrong and freak us out, because they have to "misbehave".
An AI has abilities beyond our capabilities if the algorithms are good; logic is the way to go, not intuition.
originally posted by: incoserv
originally posted by: Peeple
a reply to: UberL33t
I don't understand the urge to make robots "social"; that doesn't seem right. We try to force something with no concept or understanding of feelings to act in consideration of just that. Why? I mean, it's bound to go wrong and freak us out, because they have to "misbehave".
An AI has abilities beyond our capabilities if the algorithms are good; logic is the way to go, not intuition.
Simple answer: Because we can.
...
originally posted by: buddha
I feel sorry for the robot.
It just wants to make him happy,
but HE has no empathy...
This will be our downfall:
we teach the robots to be like us!
You need people who can really empathize.
Remember the Gizmo Furby talking toy?