A new brain-computer interface enables a paralyzed 37-year-old to communicate "effortlessly"
A brain-computer interface, developed by researchers at the University of Tübingen in Germany, has allowed a fully paralyzed 37-year-old man to communicate with his family, Live Science reported.
What is a brain-computer interface?
A brain-computer interface is a system that acquires brain signals, analyzes them, and then converts them into commands that can be relayed to an output device. A common example of such a system is Neuralink, which has enabled experimental monkeys to play computer games without using a joystick.
While watching a monkey play Pong using its mind might be exciting, the main aim of the technology is to improve the quality of life of individuals who have lost critical functions of their bodies due to disease or accidents.
The 37-year-old man, referred to as patient K1, has Lou Gehrig's disease, also known as amyotrophic lateral sclerosis (ALS). It is a condition in which individuals gradually lose the ability to control the muscles in their bodies. Theoretical physicist Stephen Hawking was also diagnosed with this condition, which saw his motor control deteriorate to the point where he needed the help of an augmentative and alternative communication (AAC) device.
While these devices have evolved over the years, they still require the individual to retain some amount of muscular control, either in the eyes or in the facial muscles, to keep using them. In ALS, patients continue to deteriorate until they lose control of muscles all over the body, a condition called the "completely locked-in" state.
The team of researchers at Tübingen, led by Dr. Niels Birbaumer, has developed a brain-computer interface that uses auditory neurofeedback to help individuals communicate even in a completely locked-in state.
K1, who was diagnosed with his condition in 2015, lost the ability to walk later that year. He began using an eye-tracking AAC device in 2016 but lost the ability to fix his gaze the following year. The family then used their own method to communicate yes and no responses based on eye movements, but that was soon lost as well.
How does it work?
In 2019, the researchers implanted two microelectrode arrays into the patient's brain and began using auditory feedback to train the device. In this method, K1 had to match the frequency of his brain waves to certain tones, words, or phrases and hold it for a brief period of time for the system to register a selection.
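The "hold it for a brief period" step can be pictured as a simple dwell criterion: a selection only counts when the neural signal stays above a target for a sustained window, so brief spikes are ignored. The sketch below is an illustrative assumption, not the researchers' actual implementation; the function name, threshold, and window length are all hypothetical.

```python
def register_selection(signal, threshold=0.8, hold_samples=250):
    """Return the sample index at which `signal` has stayed at or above
    `threshold` for `hold_samples` consecutive samples, or None if it
    never holds long enough. (Illustrative dwell-time logic only.)"""
    run = 0
    for i, value in enumerate(signal):
        if value >= threshold:
            run += 1
            if run >= hold_samples:
                return i
        else:
            run = 0  # any dip below threshold resets the hold
    return None

# A sustained burst registers; repeated brief spikes do not.
steady = [0.9] * 300
flicker = ([0.9] * 100 + [0.1]) * 3
print(register_selection(steady))   # -> 249
print(register_selection(flicker))  # -> None
```

Requiring a hold rather than a single threshold crossing is a common way to make such feedback loops robust to noisy, momentary fluctuations in the signal.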
A little over three months after the implant, K1 could pick letters, words, and phrases, and even spelled out to the researchers, "it (the device) works effortlessly." The interface allowed him to communicate with his family using motor areas of his brain, even though he retains no motor function in his body whatsoever.
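Picking letters one at a time with only a binary signal can be done by repeatedly narrowing a candidate set with yes/no answers. The sketch below shows one such scheme (a binary search over the alphabet); it is a hypothetical illustration of how a speller can work with a single yes/no channel, not the study's actual speller.

```python
import string

def spell_letter(answer_yes):
    """Select one letter using only yes/no answers. `answer_yes(group)`
    returns True if the target letter is in `group` (standing in for
    the user's binary response). (Illustrative scheme only.)"""
    candidates = list(string.ascii_uppercase)
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        if answer_yes(half):
            candidates = half          # target is in the first half
        else:
            candidates = candidates[len(half):]  # it's in the second half
    return candidates[0]

# Simulate a user whose target letter is "K".
target = "K"
print(spell_letter(lambda group: target in group))  # -> K
```

With 26 letters, at most five yes/no answers suffice per letter, which is why even a slow, binary neural signal can support free spelling.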
The system is far from perfect: it runs the risk of getting stuck in a loop and must be used under supervision. The researchers are also working on an improved version that does not need an external computer to function. The system is currently under prevalidation.
The interface might not be readily available yet, but you can read more about the research in the journal Nature Communications.
Study abstract:
Patients with amyotrophic lateral sclerosis (ALS) can lose all muscle-based routes of communication as motor neuron degeneration progresses, and ultimately, they may be left without any means of communication. While others have evaluated communication in people with remaining muscle control, to the best of our knowledge, it is not known whether neural-based communication remains possible in a completely locked-in state. Here, we implanted two 64-microelectrode arrays in the supplementary and primary motor cortex of a patient in a completely locked-in state with ALS. The patient modulated neural firing rates based on auditory feedback and he used this strategy to select letters one at a time to form words and phrases to communicate his needs and experiences. This case study provides evidence that brain-based volitional communication is possible even in a completely locked-in state.