Robot Teaches Itself Empathy, Shocks Engineers

The Columbia Engineering robot correctly predicted its partner robot's intentions.

When people live together for a long time, they learn to predict how the other person will act, respond, or move. It comes naturally to most humans, but so far this has not been the case for robots. 

Now, a Columbia Engineering robot can predict its partner robot's next moves and goals from just a few initial video frames. Essentially, it has, in a limited sense, learned to empathize.

The findings were published on Monday in the Nature journal Scientific Reports.

Robots can do all sorts of impressive tricks nowadays, from dancing autonomously to assembling themselves, and they keep improving. But they have yet to empathize with other robots or predict their movements. 

Humans, on the other hand, naturally and quickly learn to predict the imminent actions of roommates, colleagues, or family members they see regularly. This makes life easier to manage when working or living in close quarters, something robots could only wish for. Until now. 

Researchers at Columbia Engineering’s Creative Machines Lab have now developed a robot capable of predicting another robot's intent. 

They did so by placing a robot in a three-by-two-foot playpen and programming it to always move toward projected green circles. Sometimes a large red cardboard box was placed in its way, stopping the robot in its tracks because it could no longer spot the green circles. From above, an observer robot watched the scene for a couple of hours and learned to predict the playpen robot's path, taking the researchers by surprise.

It turned out that this observer robot correctly predicted the other robot's intentions 98 out of 100 times — without being given any additional information about the red box or the path of the playpen robot. 
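The logic of the experiment can be caricatured in a few lines of Python. This is only a toy sketch under assumed rules, not the paper's method (the actual observer is a learned model that predicts raw video frames): here a simulated playpen robot steps toward a goal unless a box blocks its view, and a naive "observer" extrapolates the motion it sees in the first few frames to guess where the actor will end up.

```python
def actor_path(start, goal, blocked, steps=10):
    """Playpen robot: step one grid unit toward the goal each tick,
    unless a box blocks its view, in which case it stays put."""
    pos = start
    path = [pos]
    for _ in range(steps):
        if blocked:  # cannot see the green circle: no movement
            path.append(pos)
            continue
        dx = (goal[0] > pos[0]) - (goal[0] < pos[0])
        dy = (goal[1] > pos[1]) - (goal[1] < pos[1])
        pos = (pos[0] + dx, pos[1] + dy)
        path.append(pos)
    return path

def predict_final(first_frames, total_steps):
    """Observer: estimate per-step velocity from the first few observed
    positions and extrapolate it to the end of the episode."""
    (x0, y0), (x1, y1) = first_frames[0], first_frames[-1]
    n = len(first_frames) - 1
    vx, vy = (x1 - x0) / n, (y1 - y0) / n
    return (round(x0 + vx * total_steps), round(y0 + vy * total_steps))

# Unblocked: the actor walks the diagonal, and three early frames
# are enough for the observer to extrapolate the endpoint.
path = actor_path((0, 0), (10, 10), blocked=False)
print(predict_final(path[:3], 10), path[-1])

# Blocked: the actor never moves, so the observer predicts the start.
path_b = actor_path((0, 0), (10, 10), blocked=True)
print(predict_final(path_b[:3], 10), path_b[-1])
```

In the real study the observer had to infer all of this, including whether the box hid the circles, purely from watching images; the sketch only illustrates the prediction task itself.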

There's still a long way to go before robots actually feel empathy, but this is a promising step forward. 

"Our initial results are very exciting," said Boyuan Chen, lead author of the study.

"Our findings begin to demonstrate how robots can see the world from another robot’s perspective. The ability of the observer to put itself in its partner’s shoes, so to speak, and understand, without being guided, whether its partner could or could not see the green circle from its vantage point, is perhaps a primitive form of empathy," Chen continued.
