A claim that Mohsen Fakhrizadeh, a top nuclear scientist from Iran, was shot dead by a satellite-controlled machine gun using artificial intelligence (AI) is raising eyebrows.
A Revolutionary Guards commander, Brig-Gen Ali Fadavi, made the statement following Fakhrizadeh's death on November 27. Fakhrizadeh was traveling in a car convoy when he was hit 13 times, but his wife, who was sitting inches away from him, was left untouched, the BBC reports.
At this point, the claim has not been verified, and should be taken with a large pinch of salt.
Several conflicting accounts of how Fakhrizadeh was killed have emerged. On the day of the attack, the Iranian defense ministry said there had been a gunfight between his bodyguards and a number of gunmen.
Another report stated that a Nissan pick-up truck exploded at the scene.
Then, on the day of Fakhrizadeh's funeral, the head of Iran's Supreme National Security Council explained that it was a remote attack which used "special methods" and "electronic equipment." No other details were shared.
And last Sunday, Brig-Gen Fadavi allegedly said that a machine gun that was positioned in a nearby Nissan pick-up was "equipped with an intelligent satellite system which zoomed in on martyr Fakhrizadeh" and "was using artificial intelligence."
This is a worrying claim, and again, has yet to be verified.
The use of AI in conflict is a long-standing worry for scientists, and carrying out such a mission would indeed have required a very capable force.
Professor Noel Sharkey, a member of the Campaign to Stop Killer Robots, told the BBC: "If such devices were autonomous, using face-recognition to pinpoint and kill people, we would be on a downhill roll that would entirely disrupt global security."
It would indeed be a very different type of warfare if AI were used in such a way. AI is certainly being developed to assist military communications, but, as far as we are aware, it has not yet been used to shoot people.