‘Bond of trust’ could see humans and robots working together, says AI expert
A prominent engineer in AI claims humans and robots can work together peacefully if they can build a “bond of trust.” The claim is a far cry from the doomsday scenarios painted by many experts in the field.
Tariq Iqbal, an assistant professor of systems engineering and computer science in the University of Virginia’s School of Engineering and Applied Science, says he strives for machines to work with people, not replace them, according to a press release published by the institution on Tuesday.
“The overall goal of my research is how can we build a fluid and efficient human/robot team where both human and robot can share the same physical space, materials and environment,” Iqbal said. “The hypothesis is that by doing so, we can achieve that which neither the human nor the robot can achieve alone. We can achieve something bigger.”
To do this, humans and robots must build “a bond of trust,” argues the expert.
“Can a robot understand how much trust the human counterpart is putting into it?” Iqbal said. “Can it calibrate that trust? If I’m a factory worker and there is a robot, but I’m under-trusting it and thinking that the robot cannot perform the task, then I will not utilize that robot as much as I could.”
That’s why trust is crucial.
“It happens in human teams and group settings,” Iqbal said. “Whenever I delegate something, I trust that the human in my team can do it.”
But there are obstacles to achieving this lofty goal, one of which is getting robots to understand human expression.
“Whenever we try to build something to understand human behavior, it always changes,” Iqbal said. “Understanding the human intent is so hard of a problem itself, because we are expressing ourselves in so many immense ways, and capturing all those is very hard. In many cases, every time we learn something new, it’s hard to teach the machine how to interpret the human intent.”
This is partially because human expression has such a wide variety of forms.
“Whatever I’m saying is not just the message that I’m passing,” Iqbal said. “I’m passing a lot of my messages with my gestures, so just understanding the verbal message is not sufficient. If I am saying, ‘Give me that thing,’ from just the audio, there is no way for you to know which thing I’m referring to because I’m referring to some objects with my hand gesture.”
As such, Iqbal’s robots are trained with something called “multimodal representation learning,” an approach that combines verbal messages, nonverbal gestures and even human physiological sensing.
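To illustrate the general idea (this is a hypothetical sketch, not Iqbal’s actual system), multimodal representation learning typically encodes each input channel separately and then fuses the embeddings into one joint representation a robot could use to infer intent. The feature sizes, weights and encoder below are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-modality encoders (stand-ins for real speech, gesture and
# physiological models): each maps raw features to a shared 8-dim embedding.
def encode(features, weights):
    return np.tanh(features @ weights)

# Hypothetical feature vectors for each channel.
speech = rng.normal(size=16)      # e.g. audio features of "give me that thing"
gesture = rng.normal(size=6)      # e.g. hand/arm pose keypoints
physiology = rng.normal(size=4)   # e.g. heart-rate / skin-conductance signals

# Randomly initialized projection weights (learned in a real system).
W_speech = rng.normal(size=(16, 8))
W_gesture = rng.normal(size=(6, 8))
W_physio = rng.normal(size=(4, 8))

# Fuse by concatenating the per-modality embeddings into one joint vector,
# so the verbal message and the pointing gesture are interpreted together.
joint = np.concatenate([
    encode(speech, W_speech),
    encode(gesture, W_gesture),
    encode(physiology, W_physio),
])

print(joint.shape)  # (24,)
```

In this simple fusion scheme, the spoken request “give me that thing” and the accompanying hand gesture end up in one representation, which is what lets a model resolve references that audio alone cannot.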
Iqbal says this approach may allow robots to better support humans.
“The whole goal or the purpose of the robot should be to support the human. It’s not replacing the human in any means,” Iqbal said in the statement. “That means our goal should be to find out where the human needs support and build a robot to help the human in those dimensions.”