The trick to advancing AI-driven robot swarm technology may be found in video games.
Or so it seems to the University at Buffalo's (UB's) artificial intelligence (AI) institute, which landed a $316,000 federal grant to study the decisions people make while playing a video game, specifically the biometric information revealed by their brain waves and eye movements.
Video games train tactical AI
Data acquired from the experiments will help create artificial intelligence that researchers believe may improve coordination among teams of autonomous air and ground robots.
"The idea is to eventually scale up to 250 aerial and ground robots, working in highly complex situations. For example, there may be a sudden loss of visibility due to smoke during an emergency. The robots need to be able to effectively communicate and adapt to challenges like that," said Souma Chowdhury, the grant's principal investigator.
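The adaptation Chowdhury describes, keeping a swarm coordinated when smoke suddenly cuts visibility, can be pictured as a sensing-mode fallback: each robot switches how it tracks its neighbors when its cameras can no longer see far enough. The sketch below is purely illustrative; the function name, thresholds, and camera/radio split are assumptions, not details from the UB project.

```python
def choose_sensing_mode(visibility, camera_range=40.0, radio_range=100.0):
    """Pick how a robot tracks its neighbors under the current visibility.

    visibility: estimated clear line-of-sight distance in meters.
    Returns (mode, effective neighbor-detection range).
    All thresholds here are illustrative, not from the UB study.
    """
    if visibility >= camera_range:
        return ("camera", camera_range)   # clear air: cameras at full range
    if visibility > 5.0:
        return ("camera", visibility)     # haze: cameras, but shorter range
    return ("radio", radio_range)         # dense smoke: fall back to radio ranging
```

In a swarm of hundreds of robots, each unit would make a call like this locally, so the group as a whole degrades gracefully rather than losing coordination all at once.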
Co-investigators include David Doermann, director of the university's artificial intelligence institute and SUNY Empire Innovation Professor of computer science and engineering; Ehsan Esfahani, associate professor of mechanical and aerospace engineering; and Karthik Dantu, associate professor of computer science and engineering.
Research into swarm robotics draws inspiration from a wide range of natural organisms, such as ant colonies and schooling fish. But however fascinating these creatures may be, only humans can push AI systems to new tactical heights, according to Chowdhury, a member of UB's sustainable manufacturing and advanced robotic technologies (SMART) community of excellence.
Real-time strategy games become real-time robot warfare
The study, funded by the Defense Advanced Research Projects Agency (DARPA), focuses on real-time strategy games. Unlike turn-based games, these unfold in real time and demand the thoughtful allocation of resources to construct units and, ultimately, devastate opponents. Think StarCraft, Stellaris, and Company of Heroes.
But the researchers needed something simpler, so they designed a real-time strategy game of their own. Students' decisions are recorded in two ways: eye movements are tracked with high-speed cameras, and brain activity is measured with electroencephalograms.
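One way to picture the resulting dataset is that each in-game decision must be time-aligned with the nearest gaze and EEG samples, which arrive at different rates. The sketch below shows that alignment step under assumed data shapes; every name and structure here is hypothetical, not taken from the study.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Decision:
    t: float       # game-clock timestamp in seconds
    action: str    # e.g. "build_unit", "move_squad" (hypothetical labels)

def nearest_sample(timestamps, t):
    """Index of the sample timestamp closest to t (timestamps sorted ascending)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # choose whichever neighbor is closer in time
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def align(decisions, gaze_t, eeg_t):
    """Pair each in-game decision with its nearest gaze and EEG sample indices."""
    return [(d.action, nearest_sample(gaze_t, d.t), nearest_sample(eeg_t, d.t))
            for d in decisions]
```

With eye trackers and EEG rigs sampling at different frequencies, a nearest-timestamp join like this is a common first step before any modeling of what a player was looking at, and thinking, when they acted.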
The research team will eventually use this data to develop new artificial intelligence algorithms that guide the behavior of autonomous ground and air robots.
"We don't want the AI system just to mimic human behavior; we want it to form a deeper understanding of what motivates human actions. That's what will lead to more advanced AI," said Chowdhury.
One step from drones and far from Skynet
With sufficient data, the team plans to integrate the artificial intelligence it develops into more complex virtual environments built by DARPA's partner organizations, and to evaluate it there.
"This project is one example of how machine intelligence systems can address complex, large-scale heterogeneous planning tasks, and how the University at Buffalo Artificial Intelligence Institute is tackling fundamental issues at the forefront of AI," said Doermann.
This is far, far from Skynet. On a spectrum between that and the flying drones introduced during the Obama administration, AI robot swarms are probably one tiny step up from the latter. And whatever global conflicts await us in the future, AI robot swarms with the minds of gamers may be at our backs.
Preferably not targeting them.