Humanoid Robot 'LOLA' Takes Baby Steps By Using Hands to Balance
Roboticists have long studied multi-contact locomotion, in which the hands and other body parts, not just the feet, are used for support. It is the same process that allows human babies to take their first tentative walking steps and earn those bipedal bragging rights.
A decade ago, researchers from the Technical University of Munich developed a humanoid robot called LOLA as a research platform for algorithm development. According to an Inceptive Mind report, the team has now given the robot a series of major upgrades, allowing it to take its first steps using multi-contact locomotion.
LOLA is an electrically actuated humanoid robot that weighs 68 kg (150 lb), stands 176 cm (5 ft 9 in) tall, and has 26 distributed joints. Similar in build to Boston Dynamics' Atlas, it also features a pair of depth cameras in its head that allow it to build a volumetric map of its surroundings.
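For readers curious what building a volumetric map involves, here is a minimal, purely illustrative Python sketch (not LOLA's actual perception code) of how a depth-camera frame can be back-projected into 3D points and binned into a voxel occupancy grid; the camera intrinsics, resolution, and voxel size are made-up placeholder values.

```python
# Illustrative sketch only (not LOLA's pipeline): fuse one depth frame into
# a simple voxel occupancy map. Intrinsics and sizes are placeholder values.
import numpy as np

FX = FY = 320.0          # assumed focal lengths in pixels
CX, CY = 160.0, 120.0    # assumed principal point
VOXEL_SIZE = 0.05        # 5 cm voxels

def depth_to_points(depth: np.ndarray) -> np.ndarray:
    """Back-project a depth image (meters) into 3D points in the camera frame."""
    v, u = np.indices(depth.shape)
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

def voxelize(points: np.ndarray) -> set[tuple[int, int, int]]:
    """Quantize 3D points into the set of occupied voxel indices."""
    idx = np.floor(points / VOXEL_SIZE).astype(int)
    return set(map(tuple, idx))

if __name__ == "__main__":
    fake_depth = np.full((240, 320), 2.0)   # a flat wall 2 m away
    occupied = voxelize(depth_to_points(fake_depth))
    print(f"{len(occupied)} occupied voxels")
```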
The new multi-contact locomotion algorithm takes a proactive, rather than reactive, approach to stabilization: instead of only recovering after a disturbance, the robot plans additional contact points in advance, which lets LOLA adapt dynamically to its environment. The biggest change is that the robot now uses its hands for balance and stabilization, much as humans do.
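To make the proactive-versus-reactive distinction concrete, here is a toy Python sketch, entirely our own illustration rather than the TUM algorithm, in which a planner checks a predicted stability margin for the upcoming step and reserves a nearby hand support in advance when that margin looks too thin. The margin formula, reach limits, and surface names are invented for the example.

```python
# Toy sketch (not the TUM code): proactive use of an extra contact.
# The planner picks a hand support *before* executing a risky step, based on
# a predicted margin, instead of waiting for the balance error to grow.
from dataclasses import dataclass

@dataclass
class Surface:
    name: str
    distance_m: float      # reach distance from the torso to the surface
    height_m: float        # surface height, e.g. a handrail or wall point

def support_margin(step_length_m: float, extra_contacts: int) -> float:
    """Toy stability margin: shrinks with step length, grows with contacts."""
    return 0.12 - 0.08 * step_length_m + 0.05 * extra_contacts

def plan_hand_contact(step_length_m: float, surfaces: list[Surface],
                      margin_threshold: float = 0.05) -> Surface | None:
    """Proactive selection: if the predicted margin for the upcoming step is
    too small, choose the nearest reachable surface for a hand contact now."""
    if support_margin(step_length_m, extra_contacts=0) >= margin_threshold:
        return None  # feet alone are predicted to be enough
    reachable = [s for s in surfaces if s.distance_m < 0.8 and s.height_m < 1.2]
    return min(reachable, key=lambda s: s.distance_m, default=None)

if __name__ == "__main__":
    env = [Surface("handrail", 0.6, 0.9), Surface("wall", 1.0, 1.1)]
    choice = plan_hand_contact(step_length_m=0.9, surfaces=env)
    print("planned hand contact:", choice.name if choice else "none")
```

A reactive controller would instead trigger the hand contact only after the measured balance error crossed a threshold, which is often too late for a slow, electrically actuated arm to help.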
Impressive baby steps for LOLA and bipedal locomotion
A video (below) released by the researchers shows LOLA performing various multi-contact maneuvers. For the demonstration, the researchers essentially rendered the robot blind, showing that it keeps its balance through multi-contact support rather than the volumetric data from its cameras.
In various scenarios, the robot keeps its footing by bracing with its hands, including while walking over moving surfaces and being pushed, though, we must say, rather less vigorously than in those famous Skynet-baiting Boston Dynamics videos.
The research on LOLA is still at an early stage: in the latest experiments, hand contact points and foothold positions were set manually, though the team hopes to automate their selection soon. Even so, the robot exploits these multiple contact points dynamically and in real time, producing some of the most human-like robotic movement we've seen.
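As an illustration of what "manually set" contact points might look like in practice, the short Python sketch below hands a hard-coded schedule of footholds and a hand contact to a toy query function. The end-effector names, timings, and coordinates are hypothetical and do not come from the LOLA software.

```python
# Hypothetical illustration of a manually specified contact schedule; the
# names, timings, and coordinates are invented, not taken from LOLA's code.
contact_schedule = [
    # (time_s, end_effector, position_xyz_m)
    (0.0, "right_foot", (0.30, -0.10, 0.00)),
    (0.8, "left_hand",  (0.55,  0.40, 0.95)),   # brace against a handrail
    (1.2, "left_foot",  (0.60,  0.10, 0.00)),
]

def active_contacts(t: float) -> list[str]:
    """Return the end effectors whose contacts are established by time t."""
    return [ee for start, ee, _ in contact_schedule if start <= t]

if __name__ == "__main__":
    for t in (0.0, 1.0, 1.5):
        print(f"t={t:.1f}s contacts:", active_contacts(t))
```

Automating the process would mean replacing this hard-coded list with contacts chosen by the robot itself from its volumetric map.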