For several years now, autonomous drones have been trialed to detect signs of life in disaster areas. Now, in a first-of-its-kind study, researchers from Adelaide and Iraq are going one step further.
The engineers, from the University of South Australia and Middle Technical University in Baghdad, have designed a computer vision system that can distinguish survivors from deceased bodies at a distance of 4-8 meters.
Detecting signs of life
With all of the work going into detecting life on Mars you'd think that detecting human life would be a walk in the park. Disaster areas are notoriously difficult to search through, leading experts to look toward tech solutions like drones to help in the endeavor.
The new system, tested by the Adelaide and Baghdad scientists, works as long as a person's upper torso is visible. If it is, the drone's cameras can pick up tiny movements in the individual's chest cavity and use them to measure heart rate and breathing rate.
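The paper's exact processing pipeline isn't described here, but the general idea of reading vital signs from periodic chest motion can be sketched: treat the chest region's frame-to-frame movement as a 1-D signal over time, then find the dominant frequencies in the typical breathing and cardiac bands. The frame rate, signal shape, and band limits below are illustrative assumptions, not the researchers' implementation.

```python
import numpy as np

def dominant_rate(signal, fps, low_hz, high_hz):
    """Return the strongest frequency (in Hz) within [low_hz, high_hz]."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return freqs[band][np.argmax(spectrum[band])]

# Simulated 60-second chest-motion trace at 30 fps: a slow breathing
# oscillation plus a much fainter, faster cardiac component.
fps, duration = 30, 60
t = np.arange(fps * duration) / fps
motion = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)

breathing_hz = dominant_rate(motion, fps, 0.1, 0.7)  # typical breathing band
heart_hz = dominant_rate(motion, fps, 0.8, 3.0)      # typical cardiac band
print(f"breathing: {breathing_hz * 60:.0f} breaths/min")  # ~15
print(f"heart: {heart_hz * 60:.0f} beats/min")            # ~72
```

A real system would first have to produce that motion signal from video (for example by tracking the torso region across frames), which is the harder part; the frequency analysis above is only the final step.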
Previous systems relied on less precise readings such as skin color change and body temperature.
Other existing techniques, such as using thermal cameras, are only able to detect signs of life when there is a strong contrast between body temperature and the ground. This makes it difficult to detect vitality in warm environments. In cold environments, insulated clothing can also get in the way of detection.
The new tests build on previous work by the same group of engineers. In 2017, they showed that a camera on a drone could successfully measure heart and respiratory rates.
However, that system could only detect signs of life in people who were standing upright, making it an early prototype rather than a tool ready for disaster zones, where survivors are often lying down or trapped.
Helping first responders
UniSA Professor Javaan Chahl says the new technology could be used effectively in disaster zones where time is critical, helping first responders to look for survivors.
"This study, based on cardiopulmonary motion, is the first of its type and was performed using eight people (four of each gender) and a mannequin, all lying on the ground in different poses," Chahl said in a press release. "Videos were taken of the subjects in daylight, up to eight metres away, and in relatively low wind conditions for one minute at a time, with the cameras successfully distinguishing between the live bodies and the mannequin."
Though it is an improvement on previous versions, Chahl says the drone-integrated, motion-based system needs additional testing, for example in harsh weather conditions or in situations where a person's upper torso is partially covered.
However, it is another step towards a faster response in situations where a difference of seconds can save a life.