Bio-Inspired Drones Read Textures to Improve AI Vision
Flying insects such as honeybees flit from flower to flower and weave between obstacles using a cue called optical flow: the apparent speed at which objects move across their field of vision.
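In an idealized model, a camera approaching a surface at speed v from distance d sees a flow divergence of roughly v/d, and holding that divergence constant yields a smooth, exponentially slowing descent. Here is a minimal Python sketch of that bio-inspired landing strategy; the numbers and function name are our own illustration, not the researchers' controller:

```python
# Toy model: a drone commands its descent speed so that the optical-flow
# divergence v / d stays constant, producing an exponentially slowing landing.

def constant_divergence_landing(d0, divergence, dt=0.01, steps=1000):
    """Simulate heights over time when the drone commands v = divergence * d."""
    d = d0
    heights = [d]
    for _ in range(steps):
        v = divergence * d  # descent speed chosen so v / d stays constant
        d -= v * dt         # integrate one time step of descent
        heights.append(d)
    return heights

heights = constant_divergence_landing(d0=10.0, divergence=0.5)
```

In this idealized loop the height decays smoothly toward zero; in a real drone, the flow estimate near touchdown becomes noisy, which is consistent with the oscillations the researchers describe.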
Roboticists have tried to mimic this method in small autonomous drones with little success so far.
A team of researchers from TU Delft and the Westphalian University of Applied Sciences has developed a new, more effective optical-flow-based learning process that estimates distances from the shape, color, and texture of objects. Their study is published in Nature Machine Intelligence.
Programming optical flow into drone sensors
The research team's new artificial intelligence (AI)-based learning method improves the navigation skills of small drones.
Because small flying drones are far more restricted in onboard sensing than, say, an autonomous car, it is important that they use an extremely efficient form of artificial intelligence.
"Our work on optical flow control started from enthusiasm about the elegant, simple strategies employed by flying insects," Guido de Croon, professor of Bio-inspired Micro Air Vehicles and first author of the article, explained in a press release.
"However, developing the control methods to actually implement these strategies in flying robots turned out to be far from trivial. For example, our flying robots would not actually land, but they started to oscillate, continuously going up and down, just above the landing surface."
Serious limitations of optical flow robotics
Unfortunately, optical flow in small drones has some serious limitations. Perhaps the worst is that obstacles directly ahead, in the drone's direction of motion, produce almost no optical flow, so their distance estimates are swamped by noise. In other words, the obstacles the drone is most likely to hit are exactly the ones that are hardest to detect.
"We realized that both problems of optical flow would disappear if the robots were able to interpret not only optical flow, but also the visual appearance of objects in their environment," adds Guido de Croon. "This would allow robots to see distances to objects in the scene similar to how we humans can estimate distances in a still picture. The only question was: How can a robot learn to see distances like that?"
Texture as a marker for distance
The team found that learning how the texture of outdoor objects varies with distance led to much smoother landings for their small drones.
"Learning to see distances by means of visual appearance led to much faster, smoother landings than we achieved before," says Christophe De Wagter, researcher at TU Delft and co-author of the article. "Moreover, for obstacle avoidance, the robots were now also able to see obstacles in the flight direction very clearly. This did not only improve obstacle detection performance, but also allowed our robots to speed up."
The new method will be particularly relevant for smaller drones with limited onboard resources, the researchers explained. It could greatly improve the efficiency of small drones used in greenhouses for crop monitoring or in warehouses for inventory and stock tracking.