MIT team designs robotic gripper that could soon clean our homes

"Now with reflexes, we think we can one day pick and place in every possible way, so that a robot could potentially clean up the house."
Amal Jos Chacko
The research team, from left to right: Sangbae Kim, Elijah Stanger-Jones, Andrew SaLoutos, and Hongmin Kim.

MIT News/Jodi Hilton 

Reacting to stimuli on the fly has, until now, been largely exclusive to living beings. In a world of uncertainties, adjusting to changes in our surroundings has been a big reason why we're still around as a species.

In their pursuit of giving robots a human touch, MIT engineers have now developed a gripper that grasps by reflex.

Reflexive control is a system that uses automatic responses, such as the knee-jerk reflex, to make decisions. Rather than start from scratch after a failed attempt, these systems enable robots to reflexively roll, palm, or pinch an object to get a better hold.

“In environments where people live and work, there’s always going to be uncertainty,” says Andrew SaLoutos, a graduate student in MIT’s Department of Mechanical Engineering. “Someone could put something new on a desk or move something in the break room or add an extra dish to the sink. We’re hoping a robot with reflexes could adapt and work with this kind of uncertainty.”

Most conventional robotic grippers rely on visual data, typically from cameras. This can slow the robot's reaction time, especially when a grasp attempt fails and the robot must gather its wits (in this case, fresh camera data) before trying again.

Sangbae Kim's team built a new platform that is more reflexive and reactive, using the fast, responsive actuators designed for the mini cheetah, the group's four-legged robot built to run, leap, and adapt to varied terrain.

In addition to this platform, the new design consists of a high-speed arm and two lightweight, multi-jointed fingers. A camera mounted at the base of the arm and custom high-bandwidth sensors at the fingertips record the force and location of any contact, as well as the proximity of nearby objects, more than 200 times per second.
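To make the sensing rate concrete, here is a minimal sketch of what polling fingertip contact data at roughly 200 Hz could look like. The `ContactReading` fields and the `poll_fingertips` helper are hypothetical illustrations, not the team's actual interface.

```python
import time
from dataclasses import dataclass

@dataclass
class ContactReading:
    force: float      # contact force at the fingertip (N) -- hypothetical field
    location: float   # contact position along the finger -- hypothetical field
    proximity: float  # distance to the nearest object (m) -- hypothetical field

def poll_fingertips(read_sensor, rate_hz=200, duration_s=0.05):
    """Poll a fingertip sensor at a fixed rate and collect the readings.

    A real controller would run this as a hard real-time loop; the
    time.sleep call here is only a stand-in for that timing.
    """
    period = 1.0 / rate_hz
    readings = []
    for _ in range(int(duration_s * rate_hz)):
        readings.append(read_sensor())
        time.sleep(period)
    return readings
```

At 200 Hz the loop has a 5-millisecond budget per reading, which is the window the reflex layer has to react to an unexpected contact.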

The team programmed an algorithm that directs the robot to quickly activate a grasp maneuver in response to real-time measurements at the fingertips without involving the high-level planner. Instead, a lower decision-making level takes care of reflexes, simulating instinct.
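The split between the high-level planner and the low-level reflex layer can be sketched as a simple decision function that acts directly on fingertip measurements. The thresholds and action names below are illustrative assumptions, not the team's published control law.

```python
def reflex_step(force_n, proximity_m,
                grasp_threshold_m=0.02, contact_threshold_n=0.5):
    """Low-level reflex: react to fingertip data without consulting
    the high-level planner. All thresholds are hypothetical."""
    if proximity_m < grasp_threshold_m:
        return "grasp"          # object close enough: trigger the grasp now
    if force_n > contact_threshold_n:
        return "regrasp"        # unexpected contact: adjust the hold reflexively
    return "continue_plan"      # nothing urgent: defer to the planner
```

Because this function never calls back into the planner, it can run inside the fast sensing loop, which is the company-hierarchy analogy Kim draws: routine decisions are handled locally rather than escalated to the CEO.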

Kim describes this as analogous to delegating tasks to lower-level divisions in a company rather than having the CEO micromanage every detail.

In their experiments, the team observed the new design gripping objects successfully more than 90 percent of the time without having to back out and start again, while increasing the area of successful grasps by over 55 percent compared to a conventional grasping controller.

SaLoutos, along with his colleagues post-doc Hongmin Kim, graduate student Elijah Stanger-Jones, Menglong Guo SM’22, and professor of mechanical engineering Sangbae Kim, the director of the Biomimetic Robotics Laboratory at MIT, will present their design at the IEEE International Conference on Robotics and Automation (ICRA) in May.
