Neural engineering is at the forefront of the human relationship with technology. Using virtual reality technology, researchers are focusing on increasing the level of control available for users of prosthetic limbs.
[Image source: ASU]
Arizona State University’s Neural Engineering Lab is developing prosthetic technology with new levels of maneuverability. Headed by Associate Professor Bradley Greger, the research team is focusing on how the brain interacts with prosthetic limbs.
‘It’s not just telling the fingers to move. The brain has to know the fingers have moved as directed,’ Greger told ASME.
To achieve this, researchers collected a month’s worth of data from electrode arrays implanted into the nerves in the arms of two amputees. They then examined the neural activity recorded while the amputees used virtual reality technology to control ghost digits.
‘It’s not like we need any fundamental breakthrough,’ Greger continued. ‘We need some good engineering and sufficient resources. The issue is a robustly engineered electrode of the right materials that is also compliant. It has to be a little bit more biological. It’s got to move and shift and be flexible like the nerve that it is interfacing with. A lot of the approaches have come from an electrical engineering background where they approach it from a rigid circuit connector.’
Humanizing the technology is a priority for the team.
‘The real exciting opportunity is to think about the neural code not as one-to-one mapping when I move my index finger, but when I do this whole kind of posture with my hands, there is real synergy. The challenge is how to get the neural signal that’s operating in full synergy to talk to a mechanical device that’s set up to also move with the synergies.’
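Synergies of this kind are often illustrated as low-dimensional structure in hand-posture data: a handful of coordinated movement patterns can account for most joint configurations. The sketch below is purely illustrative and uses synthetic data, not anything from the ASU study; it shows how principal component analysis could recover a few underlying "synergies" from many recorded joint angles.

```python
import numpy as np

# Illustrative sketch with synthetic data: extract hand "synergies" as
# the principal components of joint-angle recordings. The study itself
# does not publish this code; names and numbers here are assumptions.

rng = np.random.default_rng(1)

n_postures, n_joints = 200, 20   # recorded hand postures x joint angles

# Generate synthetic postures driven by 3 hidden synergies plus noise
synergies_true = rng.normal(size=(3, n_joints))
activations = rng.normal(size=(n_postures, 3))
postures = activations @ synergies_true \
    + rng.normal(scale=0.05, size=(n_postures, n_joints))

# PCA via SVD on mean-centered data
centered = postures - postures.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Fraction of postural variance captured by each component
explained = (S ** 2) / (S ** 2).sum()
print(f"variance explained by top 3 components: {explained[:3].sum():.3f}")

# A controller could then drive all n_joints joints from just 3 numbers:
coords = centered @ Vt[:3].T                 # synergy weights per posture
reconstructed = coords @ Vt[:3] + postures.mean(axis=0)
```

In this toy setup the top three components capture nearly all of the variance, which is the sense in which a mechanical hand "set up to move with the synergies" could be commanded with far fewer control signals than it has joints.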
Interpretation of the data will inform the development of an inbuilt neural decoding system, enabling more natural use of prosthetic limbs.
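At its simplest, neural decoding of this kind means learning a mapping from recorded nerve activity to intended movement. The following is a minimal sketch under stated assumptions, not the team's actual decoder: it fits a linear (ridge-regression) decoder from synthetic binned firing rates to a single finger-joint angle.

```python
import numpy as np

# Minimal sketch of a linear neural decoder. All data here are
# synthetic; the real system's inputs, outputs, and model are unknown.

rng = np.random.default_rng(0)

n_bins, n_channels = 500, 16     # time bins x recording channels
true_weights = rng.normal(size=n_channels)

# Synthetic "neural activity": spike counts per channel per time bin
X = rng.poisson(lam=5.0, size=(n_bins, n_channels)).astype(float)

# Synthetic finger angle driven by that activity, plus noise
y = X @ true_weights + rng.normal(scale=0.5, size=n_bins)

# Ridge regression: w = (X^T X + lambda * I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ y)

# Decode: predicted finger angle for each time bin
y_hat = X @ w
r = np.corrcoef(y, y_hat)[0, 1]
print(f"decoding correlation: {r:.3f}")
```

The "learning curve" Greger describes below corresponds to the user and such a decoder adapting to each other: the better the mapping, the more intuitive the prosthetic feels.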
‘There will be some learning curve because we will be introducing them to a fairly complex system that listens to the nerve and takes those signals that used to control the hand – now gone – and use them to control the prosthetic hand,’ Greger said. ‘We are hopeful that it will be more intuitive and [it is] very important that there is some sensory feedback so when they touch something, they get some sense that they touched something. That will really help them have a sense of embodiment. It really becomes like ‘their hand’.’
‘We’re working toward limbs that are accessible both financially and in terms of usability… something that is maybe not quite as sophisticated [as the expensive ones] but certainly better than the current generation of prosthetic hands.’
You can read the team’s findings in their Journal of Neural Engineering paper.