It is a breakthrough moment for neuroprosthetics.
A team of scientists from the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland has combined human control and AI robotics to improve prosthetics' movements — a world-first for this method of neural prosthetics.
Their work was published in Nature Machine Intelligence in September.
What is neuroprosthetics?
The formal term is neural prosthetics. These devices interface with a person's nervous system through electrical stimulation to compensate for deficits in motor or sensory function, which can involve movement, cognition, hearing, vision, or communication.
This new discovery is a major step forward in the world of prosthetics. In the U.S. alone, approximately two million people are amputees, with around 185,000 amputations carried out each year.
According to a report by the National Limb Loss Information Center, most amputations are caused by vascular disease, which accounts for 82% of cases.
Better prosthetics could make a real difference for these individuals. If neuroprosthetic limbs can come as close as possible to the function of an intact human limb, they would change the way many amputees live, making daily life much easier.
The EPFL team and their neuroprosthetic
Currently, there are commercial prosthetics, also known as myoelectric prosthetics, that can move thanks to links to the user's muscle movements. However, the dexterity of a prosthetic hand is nowhere near that of an intact human hand.
The EPFL researchers wrote, "While intuitive, the system provides little dexterity. People abandon myoelectric prostheses at high rates, in part because they feel that the level of control is insufficient to merit the price and complexity of these devices." So they took matters into their own hands.
The research team combined neuro-engineering, robotics, and artificial intelligence to semi-automate part of the motor command, an approach known as 'shared control.'
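The idea behind shared control can be sketched as a weighted blend of the user's decoded intent and an automated correction from the robot's own sensors. The function name, blending weight, and finger values below are illustrative assumptions, not the EPFL team's actual implementation:

```python
import numpy as np

def shared_control(user_cmd: np.ndarray,
                   auto_cmd: np.ndarray,
                   alpha: float = 0.7) -> np.ndarray:
    """Blend the user's decoded command (weight alpha) with an
    automated command proposed by the robot (weight 1 - alpha)."""
    return alpha * user_cmd + (1.0 - alpha) * auto_cmd

# Hypothetical scenario: the user intends a loose five-finger grasp,
# while pressure sensors suggest tightening two fingers on a slipping object.
user = np.array([0.4, 0.4, 0.4, 0.4, 0.4])
auto = np.array([0.4, 0.4, 0.9, 0.9, 0.4])
blended = shared_control(user, auto)
print(blended)
```

In this sketch the user stays in charge of the overall movement while the automation nudges individual fingers, which is the spirit of semi-automating part of the motor command.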
The team then homed in on the design of the software algorithms, working with robotic hardware consisting of an Allegro Hand mounted on a KUKA IIWA 7 robot, an OptiTrack camera system, and TEKSCAN pressure sensors.
Next, the team created a multilayer perceptron (MLP) to learn to decode the user's intention.
Katie Zhuang, first author of the study from the EPFL Translational Neural Engineering Lab, said, "For an amputee, it’s actually very hard to contract the muscles many, many different ways to control all of the ways that our fingers move."
Zhuang continued, "What we do is we put these sensors on their remaining stump, and then record them and try to interpret what the movement signals are. Because these signals can be a bit noisy, what we need is this machine learning algorithm that extracts meaningful activity from those muscles and interprets them into movements. And these movements are what control each finger of the robotic hands."
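The pipeline Zhuang describes can be sketched as a small MLP mapping features extracted from muscle sensors on the stump to per-finger movement commands. The layer sizes, channel count, and activations below are illustrative assumptions with untrained random weights, not the study's actual architecture or parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 8   # assumed number of muscle-signal electrodes on the stump
N_FINGERS = 5    # one movement command per finger of the robotic hand

# Randomly initialised weights stand in for parameters that, in practice,
# would be learned from recordings of the user's muscle activity.
W1 = rng.normal(scale=0.1, size=(N_CHANNELS, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, N_FINGERS))
b2 = np.zeros(N_FINGERS)

def decode(features: np.ndarray) -> np.ndarray:
    """Map one window of processed muscle-signal features to finger commands."""
    hidden = np.tanh(features @ W1 + b1)             # nonlinear hidden layer
    return 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # commands squashed to [0, 1]

# One simulated feature window yields five finger activation levels.
commands = decode(rng.normal(size=N_CHANNELS))
print(commands.shape)
```

The point of the learned mapping is exactly what Zhuang notes: the raw signals are noisy, so a trained network, rather than a fixed rule, extracts the meaningful muscle activity and turns it into movements for each finger.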
Combining AI with neuro-engineering and robotics is a new approach, and the EPFL team has hugely advanced neuroprosthetic technology.