The June issue of Science Robotics — one of the most important journals in the field — is dedicated to a single topic: robotic skin. The studies inside explore futuristic-sounding materials that will give devices the ability to bend, stretch, and sense in ways that machines never could before. One article even describes an artificial skin made of human skin cells (read more below). This issue is a rare peek into a future that will look very different from our present.
Interesting Engineering recently sat down with neuroengineer Luke Osborn, a senior researcher at the famed Applied Physics Laboratory at Johns Hopkins University in Maryland. Osborn is an expert in systems that return sense perception to people who use prosthetics. In 2019, he earned a spot on Forbes' 30 Under 30 for developing a skin that can "recreate a sense of pressure and pain." We asked Osborn to bring us up to speed on robotic skin and explain the challenges that neuroengineers are working to overcome.
This interview has been edited for length and clarity.
Interesting Engineering: Why is a neuroengineer working on robotic skin?
Luke Osborn: We're exploring how to connect humans and machines together, usually with the purpose of either restoring some function or making it easier or better for somebody to do something. For example, with prosthetic limbs that get attached to a human. So that's kind of the big picture.
More specifically, I focus on the sensing and restoration of touch and sensations for individuals. So, somebody who has maybe lost a limb from amputation and is using a prosthetic arm. How do we get information about what it is they're touching back to them? We have to build the sensors. But sending that information back to the human can also be tricky. My field is in this neuroengineering space with a focus on sensation and perception for humans and robots.
IE: What are some of the major challenges you and your colleagues face in creating these artificial skins?
Osborn: There are big challenges in creating sensors that are able to measure and detect what's going on in the environment, generally through touch. The challenge comes from the expectation that we can create sensors that do everything human skin can do, and we're not quite there yet. In fact, we're a long way off because the skin is able to do so much. There are so many receptors in the skin that measure things like temperature and pressure and vibration. So, historically speaking, creating a sensor to capture all the information that human skin can capture has been extremely challenging.
There are other pieces too. How do you make those sensors soft and compliant so that they fit onto a robot if the robot is moving around? And then there's another piece: How do we take that information and send it back to either the robot, so the robot knows what's going on, or to the human, so the human can feel what's going on?
IE: What kind of progress have researchers made in recent years?
Osborn: We're still not able to replicate what the human skin does, but there have been some really fascinating engineering advancements that are giving robots the ability to sense and perceive things that we might not typically think about. From the perspective of rehabilitation, we're thinking about how we restore and mimic what the skin does so that we can feel what's going on in the world. But there's an interesting engineering challenge there as well. If we can't exactly recreate it, are there other things we can do to make these sensors better or more functional, and to extract more information from what it is robots are interacting with than you might be able to with human skin?
IE: A paper published earlier this month in the peer-reviewed journal Matter blurs the line between human skin and robot skin. What did you think of that study?
Osborn: It's pretty wild. This is really exciting because they basically created an artificial skin for a robot that's made from living cells, which is pretty unique. That's not something that we typically think about. We think of having robots be these artificial mechanical systems that aren't necessarily human-like.
One challenge with robotic systems that contain sensors is when those sensors get damaged or cut up or something like that, they don't have a way of healing themselves. So, this paper is pretty interesting, because it takes a unique approach to using living skin to build what's essentially a substrate that can go over the robotic finger. Skin does a lot more than just letting us feel what we're touching. It also helps protect the insides of our body. The same could be true for a robot where you may want to have some protective skin on it to shield it from other environmental factors and things like that.
Having an artificial skin like this, that is able to heal on its own or with the help of living cells, is pretty exciting because you're now potentially creating something that is able to recover from damage.
IE: How do the engineers behind systems like this avoid common problems like corrosion when blending living tissues and electronic elements? Are biocompatible materials advanced enough to avoid that kind of problem?
Osborn: I'm not sure, but I'll say I'm optimistic that there will be ways to start to incorporate electronic elements to do some of the sensing to help push this even further. I don't know exactly what that interaction would look like, and it's definitely something that would need to be explored. But I would say that I'm optimistic that it could be done, and it seems like the team is forward-looking in that sense.
The future would be to start to include sensors within this living skin so that it has not only the ability to protect and conform nicely and move with this robotic finger, but also to provide some information about what it's detecting or feeling when it's touching something.
IE: Do you expect this kind of hybrid system to be the future of robotic skins?
Osborn: It's hard to predict what the future looks like or why one approach may be better than another. We're in an exciting time where there are a lot of new things happening with artificial skins to make them better and smarter and more functional. That goes not only for the artificial skins themselves, but also for using their capabilities to make robots that are smarter.
IE: A paper in the current issue of Science Robotics reports a method for sensing that uses electrodes and microphones to measure vibrations and other stimuli. How does that fit into the picture of electronic skins?
Osborn: I loved seeing this approach of using things like acoustic tomography for sensing. This unique approach is essentially listening for what's happening at the surface of the artificial skin to then give some indication about the type of touch that's happening. There are a lot of really great examples of artificial skins that exist, including some of the stuff we've worked on.
One of the many challenges is being able to effectively interpret what it is that an artificial skin is sensing. In general, that can be challenging, especially if you think about how complex the types of sensations are that we receive from our own skin. So things like different gestures and different distributions of pressure are all integrated together to give us the perception of what it is we're feeling.
Here, they take an interesting approach where they basically use electrical impedance measurements. So they're measuring the electrical activity through this soft medium, this soft artificial skin, to figure out how hard you're pressing. They're also able to use microphones to pick up vibrations, because as you touch and move across your skin, you're generating these small vibrations.
They then show that they can measure that, which is cool because it lets them do things like tell whether you're patting or tickling or moving your fingers in a unique way, and essentially track that movement across the artificial skin. To me, that's exciting because it opens up some possibilities in being able to more finely detect and understand what it is that this artificial skin is experiencing. That will be useful for robots and prosthetic limbs.
IE: Is that analogous to the way any of the sensory systems in human skin work?
Osborn: I would stop short of saying that it mimics it, but I would say that some of these new advancements seem to be heavily inspired by the way our own skin receptors work, specifically by detecting different aspects of touch. For instance, we have receptors in our skin that are more sensitive to an indentation or pressure, and we have other receptors that might be more sensitive to vibrations across the skin. When the wind blows across your skin, there are receptors that are more sensitive to that.
What's particularly exciting is that we're able to draw inspiration from some of these biological mechanisms and put that into these artificial skins. For instance, here, the ability to detect vibration — even though they're using a microphone — is not entirely dissimilar from what you might expect some receptors in the skin to be doing.
They obviously behave very differently and respond to different things, but the idea here is that we're starting to explore ways these artificial skins can extract more information and capture more information about what's being touched. It seems that there's a lot of inspiration being drawn from these biological receptors that we see naturally occurring in human skin.
IE: What do you think the future of artificial skin is going to look like?
Osborn: I don't necessarily think that we'll be able to fully replicate all of the sensitivities and functionality that we see in human skin. But I think we're going to be able to create pretty unique solutions that are potentially able to go beyond what we might think human skin is typically able to do.
So, basically creating artificial skins that go beyond what we currently think the limits of sensing and function should be. As humans, we can feel things like pressure, temperature, and the wind across our skin. But with some of these artificial skins, you can start to incorporate novel sensing modalities, such as being able to detect chemicals in the air.
I don't think we're going to necessarily, one-for-one replicate human skin and human skin sensitivity and be like, "Alright, we're done, let's move on." I think what's more exciting is figuring out ways to incorporate novel sensing modalities into artificial skins that give robots and prosthetic limbs even more information about the environment than we're necessarily thinking about right now.