Rise of the machines

Are killer robots about to take over the world? Here’s what the experts have to say.
Alice Cooke
Killer robots - it could happen.
  • Killer robots have long been the subject of movies and novels, but are they now real?
  • There are already people mounting guns on clones of Boston Dynamics’ Spot robot dog
  • What’s more, there are experts across the world who think it’s a very real threat – and there’s even a campaign to stop it

Talk about life imitating art… Word on the street has it that Russian spies used to watch the latest Bond movie to see what technologies might be coming their way – killer robots et al.

No, really. This is according to Robert Wallace, former head of the CIA’s Office of Technical Service (the US equivalent of Q, MI6’s fictional gadget-master from the Bond franchise).

Fact or fiction?

But are killer robots actually a thing beyond fiction? (If you’re struggling to recall any, think HAL in Kubrick’s 2001: A Space Odyssey, the robot Maria in Fritz Lang’s 1927 classic Metropolis, and Arnold Schwarzenegger’s T-800 in the Terminator series.)

The answer, it would seem, is possibly.

I mean, when you think about it, it makes sense. We know that, in the theatre of war, the decisions to identify, track, and destroy targets are increasingly being handed over to AI and algorithms.

And, you could argue, this is taking the world to a dangerous place, with a host of moral, legal, and technical implications.

Turkey is already emerging as a major player in the drone power stakes… and you can’t tell me they’re the only ones. And that’s just drones.

Are we about to cross a moral red line into a world where unaccountable machines decide who lives and dies?

Robot manufacturers think so. In fact, it’s such a real possibility that they’ve made a pledge they won’t weaponize their inventions. Well, more accurately, some of them have.


The pledge

Just this month, six leading robotics companies pledged they would never weaponize their robot platforms. They include Boston Dynamics, maker of the Atlas humanoid robot, which can perform an impressive backflip, and the Spot robot dog, which looks like it’s straight out of the TV series Black Mirror.

So far, so unthreatening... Right?

But the point is that if they wanted to build a killer robot, they could. The very fact that they’ve felt the need to sign a pledge saying they won’t rather proves it.

And this isn’t the first time robotics companies have spoken out about this potentially worrying predicament – no, no. Five years ago, an open letter signed by Elon Musk and more than 100 founders of other AI and robotics companies called on the United Nations to regulate the use of killer robots. The letter even knocked the Pope into third place for a global disarmament award.

Believe me yet?

However, the fact that leading robotics companies are pledging not to weaponize their robot platforms is (possibly) virtue signaling more than anything else. Surely. (Or is that my cynical opinion? I’ll leave it up to you to decide.)

In fact, to further prove my point, there are already people mounting guns on clones of Boston Dynamics’ Spot robot dog. And such modified robots could be very effective in action.

Furthermore… some of you may remember that Iran’s top nuclear scientist was assassinated by Israeli agents using a remote-controlled robotic machine gun in 2020. Just saying.

The view from the experts

So, what do the people in the know have to say about it?

Paul Scharre, a senior fellow and director of the Technology and National Security Program at the Center for a New American Security (an independent, bipartisan national security think tank based in Washington, D.C.), says: “I'd say that almost all major military powers are racing forward to invest in more robotics and autonomous artificial intelligence.

“I think for many of them, they have not yet made a decision whether they will cross the line to weapons that actually choose their own targets, to what I would call an autonomous weapon. I think for a lot of Western countries, they would agree that there's a meaningful line there. They might parse it in different ways.”

Mary Wareham of Human Rights Watch adds fuel to the fire by saying: “We've articulated legal concerns, but there are much broader concerns here that we're also worried about, too. This notion of crossing a moral line and permitting a machine to take human life on the battlefield or in policing or in border control and other circumstances, that's abhorrent, and that's something that the Nobel Peace Laureates, the faith leaders and the others involved in the Campaign to Stop Killer Robots want to prevent. For them that's a step too far.”

Yes, you read that right… there’s a Campaign to Stop Killer Robots. So clearly there’s a significant body of people who think this is a very real threat to life as we know it.

Wareham continues: “They also worry about outsourcing killing to machines. Where's the ethics in that? Then, what impact is this going to have on the system that we have in place globally? How will it be destabilizing in various regions, and, as a whole, what will happen when dictators and one-party states and military regimes get ahold of fully autonomous weapons? How will they use them? How will non-state armed groups use them?”

So, what are we doing about it?

Since 2018, the United Nations Secretary-General António Guterres has repeatedly urged states to prohibit weapons systems that could, by themselves, target and attack human beings, calling them “morally repugnant and politically unacceptable.”

Several dozen nations have already called on the UN to regulate killer robots. The European Parliament, the African Union, Nobel peace laureates, church leaders, politicians, and thousands of AI and robotics researchers have all called for regulation.

Interestingly, Australia has not supported these calls thus far. But perhaps they don’t see a need?

Last year, as reported by the New York Times, a majority of the 125 nations that belong to an agreement called the Convention on Certain Conventional Weapons (CCW) said they wanted curbs on killer robots. But they were opposed by members that are developing these weapons – most notably the United States and Russia.

Make of that what you will.

Have I convinced you yet? Or do you think I should retreat to my bunker and put a sieve on my head? Only you can decide. But I would suggest there’s a body of evidence here that simply cannot be ignored.
