This Robot Can Be Controlled by Brain Signals and Hand Gestures

Brain-reading robots are here, thanks to breakthrough work by MIT researchers.
Jessica Miley

Scientists from MIT have developed a new way for humans to train robots using brain signals and body gestures. Traditionally, getting robots to perform specific, precise tasks has required a huge amount of explicit programming, with humans forced to express instructions in terms the machine can parse.

But this new technique means robots can instead be controlled and trained using involuntary brain signals and intuitive hand gestures. The team responsible for the breakthrough developed a way to harness brain signals called "error-related potentials" (ErrPs), which occur automatically when people notice a mistake.

System uses unconsciously generated brain signals

The system works by monitoring the brain activity of a person observing a robot at work. If an ErrP occurs because the robot has made a mistake, the robot is notified and pauses to wait for a correction from its human observer. The observer can then correct the mistake via simple hand gestures, which the robot understands through an interface that monitors muscle activity.
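To make that flow concrete, here is a minimal sketch of the supervisory loop in Python. Everything here is hypothetical: the function names (detect_errp, classify_gesture), the Robot class, and the data formats are stand-ins for the trained classifiers and robot interface the researchers describe, not MIT's actual code.

```python
import random  # stands in for a real robot's planner in this sketch


def detect_errp(eeg_window):
    """Return True if the EEG window contains an error-related potential.

    Placeholder: the real system runs a trained classifier on streaming
    EEG to catch the involuntary ErrP fired when the observer sees a mistake.
    """
    return eeg_window["errp_score"] > 0.5


def classify_gesture(emg_window):
    """Map forearm muscle activity to a correction: 'left', 'right', or None."""
    return emg_window.get("gesture")


class Robot:
    """Toy robot that moves toward one of three possible targets."""

    def __init__(self):
        self.target = None

    def choose_target(self):
        self.target = random.choice(["left", "center", "right"])

    def pause(self):
        print("Robot paused, waiting for a correction...")

    def apply_correction(self, gesture):
        print(f"Correcting: moving {gesture}")
        self.target = gesture


def supervised_step(robot, eeg_window, emg_window):
    """One cycle of the EEG+EMG supervision loop described in the article."""
    robot.choose_target()
    # 1. The observer just watches; an ErrP fires involuntarily on a mistake.
    if detect_errp(eeg_window):
        # 2. The robot halts instead of completing the wrong action.
        robot.pause()
        # 3. A wrist gesture, read via EMG, tells it where to go instead.
        gesture = classify_gesture(emg_window)
        if gesture is not None:
            robot.apply_correction(gesture)
    return robot.target


# Example: simulate one step where the observer spots a mistake.
robot = Robot()
supervised_step(robot,
                eeg_window={"errp_score": 0.9},  # ErrP detected
                emg_window={"gesture": "left"})  # wrist gesture to the left
```

The key design point the sketch captures is that the human never issues an explicit "stop" command: the ErrP is the stop signal, and the gesture only supplies the correction.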

In the accompanying video, you can see a robot called 'Baxter' moving a power drill to one of three possible targets. When the robot moves toward the wrong target, the observer's ErrP signal causes it to pause.

The human observer then moves their wrist to indicate in which direction, and how far, the robot should move its drill. With human supervision, Baxter's accuracy rose from 70 percent to 97 percent.

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we've been able to do before using only EEG feedback,” says CSAIL Director Daniela Rus, who supervised the work. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

The brain signals are picked up via electroencephalography (EEG), using an electrode-covered cap on the user's scalp. The muscle activity is read via electromyography (EMG), using a series of electrodes on the user's forearm.
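As a rough illustration of how differently the two streams are handled, here is a small filtering sketch. The sampling rate and frequency bands are typical textbook values for ErrP and surface-EMG work, assumptions for this example rather than figures from the MIT paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate in Hz, not taken from the paper


def bandpass(signal, low, high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)


# Simulated one-second recordings standing in for real electrode data.
eeg = np.random.randn(FS)  # scalp signal (EEG)
emg = np.random.randn(FS)  # forearm signal (EMG)

# ErrPs are slow cortical potentials, so EEG pipelines typically keep
# roughly 1-10 Hz; surface EMG energy sits much higher, around 20-450 Hz.
eeg_filtered = bandpass(eeg, 1, 10)
emg_filtered = bandpass(emg, 20, 450)
```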


While both of these technologies have individual shortcomings, mainly to do with detection accuracy, combining them yields a far more robust system. “By looking at both muscle and brain signals, we can start to pick up on a person's natural gestures along with their snap decisions about whether something is going wrong,” says the project's lead author Joseph DelPreto.

“This helps make communicating with a robot more like communicating with another person.” Excitingly, the system is plug-and-play, meaning any user can be connected to the robot without extensive retraining.

Intuitive system opens doors to applications

DelPreto says the new system is particularly important because users don't need to be trained to think in a particular way: the brain signals happen unconsciously, and the gestures are intuitive, resembling how one human might train another. "The machine adapts to you, and not the other way around," he said.

The new robotic system has many potential applications in scenarios where humans and robots work closely together. It could also be useful in situations where humans have limited speech or movement, such as robots assisting the elderly.