
Scientists engineer robots with realistic pain expressions to train doctors

The hope is to make treatment less painful and less biased.

Advances in robotics are helping train physicians to be better and perhaps more compassionate doctors. 

A team led by researchers at Imperial College London has devised a way to engineer robots with more accurate facial expressions of pain, giving doctors in training an improved way to practice on simulated patients, according to a press statement released by the institution on Friday.

The new robots will also be used for diversity training.


Better facial expressions

“Improving the accuracy of facial expressions of pain on these robots is a key step in improving the quality of physical examination training for medical students,” study author Sibylle Rérolle, of Imperial’s Dyson School of Design Engineering, said in the statement.

The robots come in a range of face shapes and skin tones, reflecting the diversity of real patients. The purpose of these varied robots is to stop aspiring doctors from developing racial or gender-related biases.

Researchers have previously attempted bias training for medical practitioners, but their approaches relied on less realistic simulations than this new method.

“Previous studies attempting to model facial expressions of pain relied on randomly generated facial expressions shown to participants on a screen,” said lead author Jacob Tan, also of the Dyson School of Design Engineering. “This is the first time that participants were asked to perform the physical action which caused the simulated pain, allowing us to create dynamic simulation models.”

Less than an hour of training

Perhaps the most advantageous aspect of this new development is how quickly it produces results.


"Current research in our lab is looking to determine the viability of these new robotic-based teaching techniques and, in the future, we hope to be able to significantly reduce underlying biases in medical students in under an hour of training," concluded Dr. Thrishantha Nanayakkara, the director of Morph Lab, the lab responsible for the engineering of these new robots.

The new study is published in the journal Scientific Reports.


Study abstract:

Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. An important source of information for physicians is the visual feedback of involuntary pain facial expressions in response to physical palpation on an affected area of a patient. However, most existing robotic medical training simulators that can capture physical examination behaviours in real-time cannot display facial expressions and comprise a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators because they do not provide medical students with a representative sample of pain facial expressions and face identities, which could result in biased practices. Further, these limitations restrict the utility of such medical simulators to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. We use the unique approach of modelling dynamic pain facial expressions using a data-driven perception-based psychophysical method combined with the visuo-haptic inputs of users performing palpations on a robot medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo randomly generated transient parameters: rate of change β and activation delay τ. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from “strongly disagree” to “strongly agree”. Each participant (n=16, 4 Asian females, 4 Asian males, 4 White females and 4 White males) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed facial expressions rated as most appropriate by all participants comprise a higher rate of change and shorter delay from upper face AUs (around the eyes) to those in the lower face (around the mouth). In contrast, we found that transient parameter values of most appropriate-rated pain facial expressions, palpation forces, and delays between palpation actions varied across participant-simulated patient pairs according to gender and ethnicity. These findings suggest that gender and ethnicity biases affect palpation strategies and the perception of pain facial expressions displayed on MorphFace. We anticipate that our approach will be used to generate physical examination models with diverse patient demographics to reduce erroneous judgments in medical students, and provide focused training to address these errors.
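The mechanism the abstract describes, in which each pain-related AU is driven by a pseudorandomly generated rate of change β and activation delay τ once a palpation is detected, can be sketched in a few lines of code. The Python snippet below is only an illustration of that idea, assuming a simple linear ramp for AU activation; the AU labels, parameter ranges, and function names are placeholders and are not the study's actual model or the MorphFace control code.

```python
import random

# Six pain-related facial Action Units; the specific AU labels here are an
# assumption for illustration, not necessarily those used on MorphFace.
PAIN_AUS = ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]

def sample_transients(rng: random.Random) -> dict:
    """Pseudorandomly draw the two transient parameters per AU:
    rate of change (beta) and activation delay (tau).
    The value ranges are illustrative placeholders, not the study's."""
    return {
        au: {"beta": rng.uniform(0.5, 4.0),   # activation units per second
             "tau": rng.uniform(0.0, 1.0)}    # seconds after palpation onset
        for au in PAIN_AUS
    }

def au_activation(t: float, beta: float, tau: float) -> float:
    """Simple ramp model: the AU stays at 0 until its delay tau has elapsed,
    then rises at rate beta, clamped to the [0, 1] activation range."""
    return max(0.0, min(1.0, beta * (t - tau)))

def simulate_trial(trial_seed: int, duration: float = 2.0, fps: int = 10):
    """One palpation trial: sample transients, then report the activation of
    each AU at each animation frame after the palpation is detected."""
    rng = random.Random(trial_seed)
    params = sample_transients(rng)
    frames = []
    for i in range(int(duration * fps) + 1):
        t = i / fps
        frames.append({au: round(au_activation(t, p["beta"], p["tau"]), 2)
                       for au, p in params.items()})
    return params, frames

if __name__ == "__main__":
    params, frames = simulate_trial(trial_seed=0)
    print("sampled transients:", params)
    print("activations at t=1.0 s:", frames[10])
```

In the study itself, the participants' 4-point appropriateness ratings would then be used to identify which β and τ values produce the most convincing expressions; that psychophysical fitting step is not reproduced in this sketch.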
