Now robots can learn about the world the same way babies do

Teach me.
Image: AP Photo/Kirsty Wigglesworth

Babies try all sorts of experiments as they learn to interact with the world around them. It’s a happy, chaotic, organic process. Now a research team has tried something similar, teaching robots to interact with objects by mimicking how children play.

When engineers or researchers teach a robot a new task, the process typically involves coding the device to move in specific ways to precise locations, or moving the machine by hand and having it record the movements so it can replicate them later. This learning process is very much the opposite of the accident-prone, playful, messy way that young babies learn to pick things up by knocking them over, or complete new tasks by watching what adults do and inferring how they could do the same thing. Babies, of course, quickly learn complex tasks, while robots have to be laboriously taught.

So University of Washington computer scientists and psychologists researching child development have explored how to teach robots new things by programming them to mimic how babies learn.

The researchers adjusted the learning algorithms of their experimental robot. Instead of strictly following programmed movements, the machine explored how different strategies worked when it was faced with a new task, and built a probability-based model of the world as it went.
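The idea of exploring strategies and building a probability-based model can be sketched in a few lines. This is a toy illustration under assumed actions and success rates, not the researchers' actual algorithm: the robot tries hypothetical actions at random, counts how often each one works, and then uses the resulting empirical probabilities to pick an action for a new task.

```python
import random

random.seed(0)

ACTIONS = ["push_left", "push_right", "lift"]

def world(action):
    # Hypothetical world response: each action succeeds with a
    # probability the robot does not know in advance.
    p = {"push_left": 0.2, "push_right": 0.7, "lift": 0.5}[action]
    return random.random() < p

# Exploration phase: play with every action and count the outcomes.
counts = {a: [0, 0] for a in ACTIONS}  # [successes, trials]
for _ in range(200):
    a = random.choice(ACTIONS)
    counts[a][0] += world(a)
    counts[a][1] += 1

# Probability-based model of the world, learned from play.
model = {a: successes / trials for a, (successes, trials) in counts.items()}

# Faced with a new task, pick the action the model rates most likely to work.
best = max(model, key=model.get)
```

Real systems use far richer state and action spaces, but the loop is the same: explore, estimate probabilities, then act on the estimates.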

The team concentrated on two very childlike behaviors: learning to look where a human is looking, in case something interesting is happening, and moving toys around on a tabletop.

First, their robot explored how its own head could look around. When it saw a human head move to look at a distant object, the robot then modeled the human head movements and tried to move its own head to look at the same thing.
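The gaze-following step amounts to learning a model of your own head, then inverting it: from the human's head angle, infer the spot being looked at, and compute the angle your own head needs to reach that spot. The sketch below uses an assumed flat-tabletop geometry (all names and numbers are illustrative, not from the study):

```python
import math

# Hypothetical geometry: both heads sit a fixed depth from the tabletop.
DEPTH = 1.0

def angle_to(target_x, head_x=0.0):
    """Pan angle (radians) that points a head at target_x on the table."""
    return math.atan2(target_x - head_x, DEPTH)

def target_of(angle, head_x=0.0):
    """Inverse model: where on the table does this pan angle point?"""
    return head_x + math.tan(angle) * DEPTH

# The robot observes the human's head angle...
human_angle = angle_to(0.5, head_x=0.0)
# ...infers the target the human is looking at...
inferred_target = target_of(human_angle, head_x=0.0)
# ...and moves its own head (offset to one side) to look at the same spot.
robot_angle = angle_to(inferred_target, head_x=-0.3)
```

Note that the robot's angle differs from the human's because the two heads sit in different places; copying the angle directly would miss the target, which is why a model of the geometry matters.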

More interestingly, the scientists also let a robot nudge objects around on a tabletop so it could learn how its own movements could move other things in space. After it developed a model for this task, they asked the robot to watch a human sort objects or clear them from the table, then do the same thing. The robot dutifully used its newly learned model of the world to move toy food on the table… but sometimes it completed the task its own way, instead of blindly copying what the human did.
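Completing the task "its own way" is the hallmark of goal-based imitation: the robot records the end state the human produced, not the human's exact movements, and plans its own actions to reach that state. A minimal sketch, with an entirely hypothetical state representation:

```python
# Start state and the end state observed from the human demonstration.
start = {"apple": "table", "bread": "table", "cup": "table"}
human_demo_end = {"apple": "bin", "bread": "bin", "cup": "bin"}

def plan(state, goal):
    """Greedy plan: one 'move object to place' action per mismatched item.

    The action order comes from the robot's own planner, so it need not
    match the order the human used.
    """
    return [(obj, goal[obj]) for obj in sorted(state) if state[obj] != goal[obj]]

def execute(state, actions):
    new = dict(state)
    for obj, place in actions:
        new[obj] = place
    return new

actions = plan(start, human_demo_end)
final = execute(start, actions)
# The goal is reached even though the movements were not copied step by step.
```

This separation of goal from movement is what lets the robot improvise, much as a child does when tidying up in their own idiosyncratic order.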

Both of these processes will be familiar to parents of young babies, and this is the whole point of the robotic research. The University of Washington robots, like babies, are starting with simple tasks. But this learning approach could, in the future, let machines quickly learn much more sophisticated things like ironing your clothes or tidying your home.