Meet ALAN, a robot that requires minimal human supervision
Are we inching closer, day by day, to a world run by robots? Or will they take over our jobs? It's one of those questions that enter people's minds with every new robot invention.
Researchers at Carnegie Mellon University have created ALAN, an autonomous robot, meaning that it can perceive its environment, make decisions based on what it perceives, and operate for extended periods of time. It has been programmed to recognize objects within that environment and then move or manipulate them.
ALAN was found to successfully complete tasks in the real world after only a small number of exploration trials.
This robot, introduced in a paper pre-published on arXiv, is set to be presented at the International Conference on Robotics and Automation (ICRA 2023).
One of the critical challenges the team faced in their research was the task-specification problem. Previously created robots have required heavy reward engineering or regular human intervention to perform their tasks. This also demands knowledge of the environment, which can be hard to obtain for every domain.
Instead, if robots can collect their own data using task-agnostic objectives and no human interference, they could then independently explore their environments and learn interesting skills.
"We have been interested in building an AI that learns by setting its own objectives," Russell Mendonca, one of the researchers who carried out the study, told Tech Xplore. "By not depending on humans for supervision or guidance, such agents can keep learning in new scenarios, driven by their own curiosity. This would enable continual generalization to different domains and discovery of increasingly complex behavior."
Mendonca added, "Next, we want to study how to utilize other priors to help structure the robot's behavior, such as videos of humans performing tasks and language descriptions. Systems that can effectively build upon this data will be able to autonomously explore better by operating in structured spaces. Further, we are interested in multi-robot systems that can pool their experience to continually learn."
In their paper, the researchers summarize the approach as follows: robotic agents that operate autonomously in the real world need to continuously explore their environment and learn from the data collected, with minimal human supervision. While it is possible to build agents that can learn in such a manner without supervision, current methods struggle to scale to the real world. Thus, they propose ALAN, an autonomously exploring robotic agent that can perform many tasks in the real world with little training and interaction time. This is enabled by measuring environment change, which reflects object movement and ignores changes in the robot's position. They use this metric directly as an environment-centric signal, and also maximize the uncertainty of predicted environment change, which provides an agent-centric exploration signal. They evaluate the approach in two different real-world play-kitchen settings, enabling a robot to efficiently explore, discover manipulation skills, and perform tasks specified via goal images.
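The two signals in that summary can be illustrated with a minimal sketch. The Python below is purely hypothetical: the function names, the robot mask, and the toy linear ensemble are illustrative assumptions for exposition, not ALAN's actual implementation.

```python
import numpy as np

def environment_change(obs_before, obs_after, robot_mask):
    """Environment-centric signal: mean absolute change in the observation,
    with entries belonging to the robot masked out, so that moving the arm
    alone (without moving any object) earns no reward."""
    diff = np.abs(obs_after - obs_before)
    diff[robot_mask] = 0.0  # ignore the robot's own motion
    return diff.mean()

class ChangeEnsemble:
    """Toy ensemble of linear models predicting environment change from a
    (state, action) feature vector. Variance across the ensemble's
    predictions stands in for the agent-centric uncertainty signal."""
    def __init__(self, n_models, dim, rng):
        self.weights = [rng.normal(size=dim) for _ in range(n_models)]

    def predict(self, x):
        return np.array([w @ x for w in self.weights])

    def disagreement(self, x):
        # High variance = the models disagree = the outcome is uncertain.
        return self.predict(x).var()

def exploration_reward(obs_before, obs_after, robot_mask, ensemble, x, beta=1.0):
    """Combine observed environment change with predicted-change uncertainty."""
    return environment_change(obs_before, obs_after, robot_mask) \
        + beta * ensemble.disagreement(x)

# Tiny demo with fabricated observations: a 4-entry "scene" where the
# first entry is the robot and the other three are objects that moved.
rng = np.random.default_rng(0)
before = np.zeros(4)
after = np.ones(4)
mask = np.array([True, False, False, False])
ens = ChangeEnsemble(n_models=5, dim=3, rng=rng)
r = exploration_reward(before, after, mask, ens, x=np.ones(3))
```

In this toy version, the agent would pick actions that score highly on `exploration_reward`: either they visibly rearrange objects, or the ensemble cannot agree on what they will do, which is exactly the kind of interaction worth trying next.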