What if a robot could sense deformations such as pressure, bending, and strain? That would go a long way toward making it more human. And it is no longer a faraway dream, thanks to advances in science and technology.
Cornell researchers have created a fiber-optic sensor that detects exactly these deformations, opening the possibility of giving soft robotic systems, as well as anyone using augmented reality tools, the ability to feel the same sensations as a mammal.
“We know that soft matters can be deformed in a very complicated, combinational way, and there are a lot of deformations happening at the same time,” doctoral student and paper co-author Hedan Bai said. “We wanted a sensor that could decouple these.”
The novel sensors combine low-cost LEDs and dyes into what the researchers refer to as a stretchable “skin.” To engineer their new robotic skin, the scientists were inspired by silica-based distributed fiber-optic sensors.
These sensors detect minor shifts in the wavelength of light, which can be used to identify multiple properties, including humidity, temperature, and strain. Bai and her team called their new invention the stretchable lightguide for multimodal sensing (SLIMS), and it adds a whole new dimension to touch.
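The decoupling Bai describes can be pictured as an inverse problem: each deformation mode changes the light reaching several measurement channels, and the sensor's readout must be separated back into the individual modes. The sketch below is purely illustrative, not the Cornell implementation; it assumes a hypothetical linear calibration matrix `A` relating three deformation modes (pressure, bend, strain) to four optical channels, and recovers the modes by least squares.

```python
# Illustrative sketch only (not the SLIMS implementation): decoupling
# simultaneous deformations from multi-channel optical readings.
# Assumption: each mode perturbs each channel linearly, so a combined
# reading is y = A @ x, inverted here with least squares.
import numpy as np

# Hypothetical calibration matrix: rows = optical channels,
# columns = deformation modes [pressure, bend, strain].
A = np.array([
    [0.9, 0.2, 0.1],   # channel 1 response to each mode
    [0.1, 0.8, 0.3],   # channel 2
    [0.2, 0.1, 0.7],   # channel 3
    [0.4, 0.5, 0.2],   # channel 4 (redundancy improves conditioning)
])

def decouple(reading):
    """Recover deformation-mode magnitudes from a channel reading."""
    x, *_ = np.linalg.lstsq(A, reading, rcond=None)
    return x

# Simulate pressure and strain occurring at the same time, then
# recover both; the result matches true_modes to floating-point
# precision because the simulated reading is noise-free.
true_modes = np.array([1.0, 0.0, 0.5])
reading = A @ true_modes
recovered = decouple(reading)
```

Using more channels than modes, as in the 4x3 matrix above, is what makes the separation well-posed even when several deformations happen at once.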
“Right now, sensing is done mostly by vision,” said Rob Shepherd, associate professor of mechanical and aerospace engineering in the Cornell College of Engineering.
“We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in the way that we currently use the cameras in our phones. It’s using vision to measure touch. This is the most convenient and practical way to do it in a scalable way.”
Even better, the new design is not limited to robotics: its inventors are also exploring ways it can enhance virtual and augmented reality experiences.
“Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tire. If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualization could say, ‘Turn and then stop, so you don’t overtighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it,” concluded Shepherd.