Babies are cute. This is an undisputed fact. Generally, robots don't endear themselves to us the way little humans can, but a new technology is bringing the way we learn to walk into the virtual minds of bipedal robots.
Researchers built a two-legged robot capable of teaching itself how to walk using a process called "reinforcement learning," according to a recent study shared on a preprint server.
A simulated robot prepared the real one to walk
The researchers built a robot with two legs connected to a small holding frame. As of this writing, the robot executes tasks while tethered to that frame under the researchers' guidance. It doesn't look impressive next to more fully developed robots like Boston Dynamics' Spot, but the one from the new study — named Cassie — is pushing the leading edge of a new kind of technology: a robot's ability to teach itself how to walk. Instead of improving via direct programming or mimicry, it learns from experience.
This process, called reinforcement learning, mirrors how human babies learn to walk. Like babies, the robot doesn't learn everything at once — it slowly gains more information, trying and failing and failing better at putting one foot in front of the other. And even after babies can walk, they keep getting better. Given enough time, people can perform tricks like running, jumping, or even skipping down the sidewalk. Skipping!
For the robot to learn the same way, the researchers — from the University of California, Berkeley — started with a simulation of the robot in a digital world. There, the simulated robot was rewarded for meeting goals like staying upright; an AI engine remembered the results of each attempt and applied the lessons learned to the next one. Eventually, the simulation let the robot teach itself to walk without risking damage to any hardware — hastening the process.
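That reward-driven loop can be caricatured in a few lines of code. This is not the study's actual method — the Berkeley team used deep reinforcement learning on a full physics simulator — but a toy sketch of the same idea: try a behavior, score it, and keep changes that score better. All names and numbers here are invented for illustration.

```python
import random

def simulate(policy_gain, steps=50):
    """Toy simulator: the 'robot' leans over; the policy applies a
    corrective push proportional to the lean. Reward = steps survived."""
    lean = 0.1
    reward = 0
    for _ in range(steps):
        lean += 0.2 * lean            # gravity amplifies the lean
        lean -= policy_gain * lean    # the policy's corrective action
        if abs(lean) > 1.0:           # fell over
            break
        reward += 1
    return reward

def train(episodes=200, seed=0):
    """Crude reinforcement learning by hill-climbing: randomly perturb
    the policy and keep any change that earns a higher reward."""
    rng = random.Random(seed)
    gain = 0.0
    best = simulate(gain)
    for _ in range(episodes):
        candidate = gain + rng.uniform(-0.1, 0.1)
        reward = simulate(candidate)
        if reward > best:             # lesson learned: keep the change
            gain, best = candidate, reward
    return gain, best
```

With no training (`policy_gain = 0`), the toy robot topples after about a dozen steps; after a couple hundred simulated attempts, the learned gain keeps it upright for the full run. Real systems replace the single number with a neural network and the toy physics with a detailed model of the robot, but the trial-score-keep structure is the same.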
Pros and cons of bipedal robots
After the simulated robot learned to walk, the researchers transferred this knowledge to Cassie, which used it to walk like a toddler. And just like a big baby, Cassie kept learning: catching itself when it slipped, recovering when shoved from the side, and even compensating successfully when two of its motors were damaged. "The learned policies enable Cassie to perform a set of diverse and dynamic behaviors, while also being more robust than traditional controllers and prior learning-based methods that use residual control," the study reads. "We demonstrate this on versatile walking behaviors such as tracking a target walking velocity, walking height, and turning yaw."
The future is wide open for the broad implementation of robots across more industries than ever before, but some organizations are concerned. In February, a provocative marketing collective called MSCHF bought a robot dog from Boston Dynamics and armed it with a Tippmann 98 paintball gun. The collective then allowed people to control the robot remotely with their phones as it moved through an art gallery filled with the company's work — during an event called "Spot's Rampage."
"When killer robots come to America they will be wrapped in fur, carrying a ball," read a manifesto from MSCHF. "Good boy, Spot! Everyone in this world takes one look at cute little Spot and knows: this thing will definitely be used by police and the military to murder people. And what do police departments have? Strong unions! Spot is employee of the month. You never need to union bust a robot — but a robot can union bust you."
Boston Dynamics did not endorse the message, but the tendency to humanize Cassie as a baby could mask the dangers bipedal robots might pose to the public if they are built, captured, or reprogrammed with bad intentions. To be clear, Cassie isn't designed to hurt people (and probably can't). But alongside a sneak peek at the future of industrial robotics, we may also be watching the very early development of a technology capable of being turned to bad ends.