This Robot Could One Day Help Hospital Patients Get Dressed

Researchers at Georgia Tech have developed a robot that taught itself to slide a hospital gown onto real humans.

Georgia Tech is developing a robot that could eventually be used to dress people in hospitals and homes for the elderly. According to the university, “More than 1 million Americans require daily physical assistance to get dressed because of injury, disease and advanced age.”

To address this problem, researchers are developing a robot that can slide hospital gowns onto people’s arms. The machine uses its understanding of force to guide the cloth over the patient’s hand, around the arm, and onto the shoulder.

The robot, known as the PR2, taught itself to complete the task in just a single day. It learned by analyzing more than 11,000 simulated examples of a robot putting a gown onto a human arm.

From these examples, the PR2’s neural network learned to estimate the forces its motions apply to the human. The simulations allowed the robot to feel what it is like for a person receiving dressing assistance.
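The article does not publish the team’s training code, but the idea of learning a force estimator from simulated trials can be illustrated with a toy sketch. Everything below is invented for illustration: the motions, the “simulated” forces, and the simple least-squares model standing in for the paper’s neural network.

```python
import numpy as np

# Toy stand-in for the simulated dataset: each example pairs a candidate
# end-effector motion (dx, dz) with the force the simulation measured on
# the arm. The real work used over 11,000 physics-simulated dressing trials.
rng = np.random.default_rng(0)
motions = rng.uniform(-1.0, 1.0, size=(11000, 2))

# Hypothetical ground truth: pulling sideways (large |dx|) snags the cloth
# and raises the force, while moving along the arm (dz) keeps force low.
forces = 3.0 * np.abs(motions[:, 0]) + 0.5 * motions[:, 1] ** 2

# Fit a least-squares model on hand-picked features, as a minimal stand-in
# for a learned force estimator.
features = np.column_stack([np.abs(motions[:, 0]), motions[:, 1] ** 2])
weights, *_ = np.linalg.lstsq(features, forces, rcond=None)

def predict_force(dx, dz):
    """Estimate the force a candidate motion would apply to the arm."""
    return float(np.array([abs(dx), dz ** 2]) @ weights)

# Predicted force for a small sideways nudge plus a modest forward motion.
print(predict_force(0.1, 0.5))
```

With a model like this, the robot never has to touch a person during training: it queries the estimator instead of performing a risky trial.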

Source: Georgia Tech

“People learn new skills using trial and error. We gave the PR2 the same opportunity,” said Zackory Erickson, the lead Georgia Tech Ph.D. student on the research team.

“Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed.”

The examples also allowed the robot to predict the consequences of different motions. In some cases, moving the gown one way pulled the cloth tight; in others, the gown slid on smoothly.

The robot could use these predictions to select the right motions to comfortably and efficiently dress the patient.
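This predict-then-select loop is the core of the approach (the paper’s title names it model predictive control). A minimal sketch of the selection step, with made-up candidate motions, a pretend force model, and an arbitrary comfort threshold, none of which come from the paper:

```python
def choose_motion(candidates, predict_force, progress, max_force=5.0):
    """Pick the candidate motion with the best predicted trade-off.

    candidates    -- list of hypothetical motions, e.g. (dx, dz) tuples
    predict_force -- learned model: motion -> predicted force on the arm
    progress      -- heuristic: motion -> how far it advances the gown
    max_force     -- comfort threshold; purely illustrative
    """
    # Keep only motions predicted to stay below the comfort threshold.
    safe = [m for m in candidates if predict_force(m) <= max_force]
    if not safe:  # no comfortable option: stop rather than push harder
        return None
    # Among comfortable motions, prefer the one that dresses fastest.
    return max(safe, key=progress)

# Toy usage: motions are (dx, dz); sideways motion is penalized heavily.
candidates = [(0.0, 0.4), (0.3, 0.4), (0.0, 0.9)]
force = lambda m: 10.0 * abs(m[0]) + m[1]   # pretend force model
advance = lambda m: m[1]                    # progress along the arm
print(choose_motion(candidates, force, advance))  # → (0.0, 0.9)
```

Repeating this choice at every step is what lets the robot “think ahead”: each motion is vetted against its predicted effect on the person before it is executed.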

Once the robot performed well in simulation, it moved on to real people. Each participant stood in front of the robot while it attempted to slide the gown onto their arm. Instead of watching what it was doing, the robot used its sense of touch to get the job done.

“The key is that the robot is always thinking ahead,” said Charlie Kemp, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University and the lead faculty member. “It asks itself, ‘if I pull the gown this way, will it cause more or less force on the person’s arm? What would happen if I go that way instead?’”

During the human testing, the researchers adjusted the robot’s timing to increase its chances of success.

“The more robots can understand about us, the more they’ll be able to help us,” Kemp said. “By predicting the physical implications of their actions, robots can provide assistance that is safer, more comfortable and more effective.”

Currently, the robot can only slide one sleeve of the gown onto a patient’s arm; the researchers say the ultimate goal of fully dressing a person will require much more research and development. Their paper, “Deep Haptic Model Predictive Control for Robot-Assisted Dressing,” will be presented May 21-25 in Australia at the International Conference on Robotics and Automation (ICRA).

Via: Georgia Tech