Humans Could Use Echolocation to Navigate Like Bats and Dolphins
Echolocation is a technique used by bats, dolphins, and some other species to identify the location of targets using reflected sound, aiding them in hunting and communicating with each other -- and now, scientists in Japan have shown that humans may actually have echolocation skills that can be honed, Pop Mech reports.
This is not the first time echolocation has been explored in humans, either; in fact, previous research has shown that some visually impaired individuals can develop the technique to better understand spaces and improve their navigation skills.
The new study, published in the journal PLOS ONE, shows that humans can use echolocation to identify the shape and rotation of different objects. Scientists demonstrated this in the lab with a group of unpracticed volunteers who were able to recognize objects purely by bouncing sounds off them.
The aim of the experiments was to see whether unpracticed, sighted individuals could use echolocation. A total of 15 people were recruited to differentiate between two 3D-printed cylinders of different geometries.
They couldn't see the objects; instead, they were told to generate a high-frequency ringing sound with the help of a mobile device, then analyze the echo's characteristics to work out which of the two cylinders was being targeted.
The echoes reflected back from the target objects were picked up by a sensor; the recordings were then downscaled to one-eighth of their original pitch and played to the participants through headphones.
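To get a feel for what downscaling to one-eighth pitch means, here is a minimal arithmetic sketch. Dividing a frequency by 8 shifts it down exactly three octaves; the 40 kHz emitted-signal figure below is purely an illustrative assumption (the article does not state the actual frequency), chosen because it sits well above the limit of human hearing.

```python
import math

def downscale(freq_hz: float, factor: int = 8) -> float:
    """Divide a frequency by `factor`, shifting it down log2(factor) octaves."""
    return freq_hz / factor

emitted = 40_000.0            # hypothetical ultrasonic signal: 40 kHz, inaudible to humans
audible = downscale(emitted)  # 40 kHz / 8 = 5 kHz, comfortably within human hearing
octaves_down = math.log2(emitted / audible)

print(audible)       # 5000.0
print(octaves_down)  # 3.0
```

The same scaling applies to every frequency component of the echo, which is why the timbre cues described below survive the shift into the audible range.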
It turns out the participants could identify which of the two cylinders was being targeted -- but only when the objects were rotating rather than stationary. Because one of the cylinders had twice as many convex surfaces as the other, listeners could detect changes in the speed, tone, and timbre of the echo as the objects rotated.
But when the participants had to distinguish between two stationary cylinders, they were less successful: without time-dependent changes in the echo, it was harder to identify the geometric features of each object.
But what does all this mean? Well, from helping you get by while camping to helping you navigate your house in the dark while suffering from late-night cravings, this ability to "see" in the dark could have numerous potential applications.
The researchers told Pop Mech that their study is "evidence that both humans and bats are capable of interpreting objects through sounds." Perhaps one day we may be able to expand this ability to interact with the world in a different way, and see the skill incorporated into wearables such as watches or glasses, changing how people with visual impairments experience the world.