Robots are getting smarter, but they still need step-by-step instructions for tasks they haven't performed before. Before you can tell your household robot "Make me a bowl of ramen noodles," you'll have to teach it how to do that. Since we're not all computer programmers, we'd prefer to give those instructions in English, just as we'd lay out a task for a child.
But human language can be ambiguous, and some instructors forget to mention important details. Suppose you told your household robot how to prepare ramen noodles, but forgot to mention heating the water or tell it where the stove is.
In his Robot Learning Lab, Ashutosh Saxena, assistant professor of computer science at Cornell University, is teaching robots to understand instructions in natural language from various speakers, account for missing information, and adapt to the environment at hand.
Saxena and graduate students Dipendra K. Misra and Jaeyong Sung will describe their methods at the Robotics: Science and Systems conference at the University of California, Berkeley, July 12-16.
The robot may have a built-in programming language with commands like find(pan); grasp(pan); carry(pan, water tap); fill(pan, water); carry(pan, stove), and so on. Saxena's software translates human sentences, such as "Fill a pan with water, put it on the stove, heat the water. When it's boiling, add the noodles," into robot language. Notice that you didn't say, "Turn on the stove." The robot has to be smart enough to fill in that missing step.
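The missing-step inference can be imagined as a planner that checks each primitive command for unmet prerequisites. This is only a minimal sketch under assumed command names and a made-up prerequisite table, not Saxena's actual system:

```python
# Hypothetical sketch: expand a command sequence by inserting missing
# prerequisite steps. The command strings and the PREREQUISITES table
# are illustrative assumptions, not the real robot's instruction set.

# Each primitive command may require another command to have run first.
PREREQUISITES = {
    "heat(pan, stove)": ["turn_on(stove)"],
    "fill(pan, water)": ["carry(pan, water_tap)"],
}

def expand_plan(steps):
    """Insert any missing prerequisite commands before each step."""
    plan = []
    done = set()
    for step in steps:
        for prereq in PREREQUISITES.get(step, []):
            if prereq not in done:
                plan.append(prereq)
                done.add(prereq)
        plan.append(step)
        done.add(step)
    return plan

# "Fill a pan with water, put it on the stove, heat the water."
steps = ["find(pan)", "grasp(pan)", "fill(pan, water)",
         "carry(pan, stove)", "heat(pan, stove)"]
print(expand_plan(steps))
```

Here the speaker never says "turn on the stove," but the expanded plan inserts turn_on(stove) before the heating step anyway.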
Saxena's robot, equipped with a 3-D camera, scans its environment and identifies the objects in it, using computer vision software previously developed in Saxena's lab. The robot has been trained to associate objects with their capabilities: A pan can be poured into or poured from; stoves can have other objects set on them, and can heat things. So the robot can identify the pan, locate the water faucet and stove and incorporate that information into its procedure. If you tell it to "heat water" it can use the stove or the microwave, depending on which is available. And it can carry out the same actions tomorrow if you've moved the pan, or even moved the robot to a different kitchen.
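The "heat water with whichever appliance is available" behavior can be pictured as a lookup over object capabilities. The capability table and scene lists below are illustrative assumptions, not the lab's actual representation:

```python
# Hypothetical sketch: choose an appliance for "heat water" based on
# object capabilities and what the 3-D scan actually detected. The
# CAPABILITIES table is an illustrative assumption.

CAPABILITIES = {
    "stove": {"support", "heat"},
    "microwave": {"heat"},
    "pan": {"pour_into", "pour_from"},
}

def find_heater(scene):
    """Return the first detected object capable of heating, if any."""
    for obj in scene:
        if "heat" in CAPABILITIES.get(obj, set()):
            return obj
    return None

# A kitchen with no stove: the robot falls back to the microwave.
print(find_heater(["pan", "faucet", "microwave"]))
```

Because the plan is expressed over capabilities rather than fixed object positions, the same procedure still works after the pan is moved or the robot is placed in a different kitchen.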
Other workers have attacked these problems by giving a robot a set of templates for common actions and chewing up sentences one word at a time. Saxena's research group uses techniques computer scientists call "machine learning" to train the robot's computer brain to associate entire commands with flexibly defined actions. The computer is fed animated video simulations of the action -- created by humans in a process similar to playing a video game -- accompanied by recorded voice commands from several different speakers.
The computer stores the combination of many similar commands as a flexible pattern that can match many variations, so when it hears "Take the pot to the stove," "Carry the pot to the stove," "Put the pot on the stove," "Go to the stove and heat the pot" and so on, it calculates the probability of a match with what it has heard before, and if the probability is high enough, it declares a match. A similarly fuzzy version of the video simulation supplies a plan for the action: Wherever the sink and the stove are, the path can be matched to the recorded action of carrying the pot of water from one to the other.
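The probabilistic matching step can be sketched with a toy similarity score: compare a new utterance against stored training phrases for each action and accept the best action only if its score clears a threshold. The word-overlap score below is a deliberately simple stand-in for the learned model the article describes, and the phrases and action names are assumptions:

```python
# Hypothetical sketch: match a spoken command against stored training
# phrases and declare a match only above a confidence threshold.
# Word-overlap (Jaccard) similarity stands in for the learned
# probability model; all phrases and action names are illustrative.

TRAINING = {
    "carry_pot_to_stove": [
        "take the pot to the stove",
        "carry the pot to the stove",
        "put the pot on the stove",
    ],
    "fill_pan_with_water": [
        "fill a pan with water",
        "fill the pan at the tap",
    ],
}

def score(a, b):
    """Word-overlap (Jaccard) similarity between two phrases."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def match_command(utterance, threshold=0.5):
    """Return the best-matching action, or None if nothing is close enough."""
    best_action, best = None, 0.0
    for action, phrases in TRAINING.items():
        s = max(score(utterance, p) for p in phrases)
        if s > best:
            best_action, best = action, s
    return best_action if best >= threshold else None

# A phrasing never seen verbatim still matches the stored pattern.
print(match_command("go to the stove and carry the pot"))
```

The threshold is what lets the system say "no match" for commands unlike anything it was trained on, rather than guessing.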
Otherwise they will most likely just kill us off, maybe not right away but eventually.
originally posted by: intrptr
a reply to: knoledgeispower
Otherwise they will most likely just kill us off, maybe not right away but eventually.
They already are. What's a drone but an intelligently guided killing machine?
All weapons of war are the perfect killing machine. And the people that man them execute their tasks robotically.
What's intelligent about that?
originally posted by: charles1952
"Robot can be programmed by casually talking to it?"
So what? So can husbands.
Sorry, off-topic, but I had to, I had to, I tell ya.
That's not what I'm talking about when I say intelligent robots and you know that.
I'm talking about a robot that can think for itself, not be controlled by man.
originally posted by: intrptr
a reply to: knoledgeispower
That's not what I'm talking about when I say intelligent robots and you know that.
I know that. What you're proposing doesn't exist yet. I think you know that, too.
I'm talking about a robot that can think for itself, not be controlled by man.
They already do that too. Besides autopilots, software programs run autonomously quite well.
The difference being sentience. AI implies that but is a long way off from the real world.
Computers will never know that they know. At best they are only capable of executing the next instruction. That's rote programming.
My comparison to supposedly intelligent people controlling bombers, missiles, tanks, drones, whatever is that they are "programmed" to kill and they act this out robotically, without question. That's akin to a computer program that "just runs programs".
If that helps.
The Actroid woman is a pioneering example of a real machine resembling the imagined machines that science fiction calls androids or gynoids, terms so far applied only to fictional robots. It can mimic such lifelike functions as blinking, speaking, and breathing. The "Repliee" models are interactive robots with the ability to recognize and process speech and respond in kind.
So, what?
The ability to question and change your own programming will be an important step, but, for robots, I suspect that it will be based on another program telling the robot to question its program under certain conditions.