
Killer robots will make war worse

There are no winners when lethal violence is consequence-free

The Terminator is looking ever more prescient. Credit: YouTube

November 17, 2020

Imagine a missile swooping low out of the dark sky and exploding against the roof of a house. The family inside are killed instantly. The neighbours next door are not so lucky, and piercing screams soon fill the night; villagers rush to break down the front door and begin to carry the badly injured into the street. As more and more people gather outside, an army of Insectdrones™ targets the crowd, swarming one-to-a-person and self-detonating on impact.

Now imagine this was carried out by robots, without any human control. Scary? Yes, and it could be our reality in the next decade.

Britain’s most senior soldier, the Chief of the Defence Staff General Nick Carter, said earlier this month that “robot soldiers could make up a quarter of the British Army by 2030”. This is not idle speculation: autonomous systems (in effect, robots) make up a non-trivial part of the Ministry of Defence’s budget proposal currently under consideration by No 10. “I suspect we can have an army of 120,000, of which 30,000 might be robots, who knows,” Carter told Sky News.

If there is one thing guaranteed to cause speculation and reams of newsprint, it is killer robots. If something is depicted in science fiction and then features in real life, of course people get excited. The classic image of killer robots, of course, comes from the Terminator series of movies — there, I’ve done it, I’ve mentioned Terminator; now I can get on with the article — but it is highly unlikely that future military robots will be humanoid-shaped cyborgs. For one thing, humanoid forms are only mechanically efficient on rough and broken terrain: ‘killer robots’ are far more likely to be drone-like, or tracked (or swarms of both).

So, how long before Insectdrones™ are patrolling our streets?

It is worth stating upfront that warfare over the next 20 years will be a story of increasing autonomy. This is not just true among the British, with Carter — a known reformer — at the helm. It is true of British allies, and potential adversaries too. The US plans to spend $1.7bn on researching autonomous systems (drones, robots and the like) in the next financial year. It is much harder to tell what China is spending on its military (or what autonomous systems it has under development), but it is already extensively using unmanned systems such as aerial drones which have a degree of autonomy built into them.

The reason that the world’s militaries are investing so heavily in this domain is pretty obvious. Weapons systems without humans are much, much faster in attack and defence, helping you get inside your opponent’s decision-making cycle (the so-called OODA loop: observe, orient, decide, act). They can also be much smaller and more robust, allowing them to get to places that humans can’t. They don’t tire or need feeding, and they have no morale problems. Finally, and particularly appealing to democracies: you can launch actions without risk of casualties. In strict military terms, autonomous systems are a no-brainer.

But the lack of casualties is where the ethical problems begin. Imagine you are a leader who has to decide whether to launch a military strike. You would weigh the benefits of military success and the chances of achieving it (e.g. enemy target destroyed) against the costs of the mission and the potential for failure (e.g. casualties, captured personnel, or making your country and military look weak). Using autonomous systems removes most of the downsides, which lowers the threshold for action — and so you are more likely to launch more attacks.
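To see how starkly autonomy shifts that calculus, here is a toy expected-value sketch of the decision. Everything in it (the net_value function, the numbers, the weights) is invented for illustration and reflects no real targeting doctrine:

```python
# Illustrative only: a toy expected-value model of the decision to strike.
# All values and weights below are invented for the example.

def net_value(p_success: float, benefit: float,
              cost_failure: float, casualty_risk: float) -> float:
    """Expected value of launching: payoff of success minus expected costs."""
    return p_success * benefit - (1 - p_success) * cost_failure - casualty_risk

# Manned strike: the risk to your own personnel weighs heavily against acting.
manned = net_value(p_success=0.75, benefit=10, cost_failure=4, casualty_risk=5)

# Autonomous strike: identical mission, but the casualty term largely vanishes.
autonomous = net_value(p_success=0.75, benefit=10, cost_failure=4, casualty_risk=0.5)

print(manned)      # 1.5 -> marginal; a leader may well hold off
print(autonomous)  # 6.0 -> clearly "worth it"; the threshold for action drops
```

The mission is identical in both cases; only the casualty term changes, and that alone flips a marginal decision into an easy one.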

We have arguably seen this pattern already with the rise of unmanned aerial vehicles, or drones, over the past two decades. And it is not a good thing, even for the progenitors of the attacks: easy, consequence-free use of lethal violence means that we tend to treat the symptoms of violence rather than its causes, and sometimes even become a cause of further violence through its sheer ubiquity. Scholars now argue, for example, that drone strikes along the Afghan-Pakistan corridor create rather than eradicate suicide bombers, generating feelings of shame and humiliation among the targeted communities. In short, these unmanned systems encourage us to be tactical rather than strategic.

But there is actually a much more profound problem: our entire ethical and legal systems are built upon human intentions and judgements.

Think about murder, for instance. In order to prosecute someone successfully for murder, one has to prove that the accused had the intention to kill the victim (this concept — mens rea, or intent — is a requirement for most criminal prosecutions). So too with the Law of Armed Conflict, which rests on four overriding principles: military necessity, distinction (between military and civilian targets), proportionality and avoiding unnecessary suffering. And in applying these concepts under the law, the judgement and intent of the soldiers and officers involved is taken into account.

How do we apply these concepts to autonomous systems?

Imagine that the attack on the village described earlier was carried out instead by a helicopter under human control. Depending on the circumstances, it could be argued that the action did not draw enough distinction between military and civilian targets, or perhaps it was a disproportionate use of firepower for a minor military target. Or perhaps the intelligence was wrong.

Let us now say that this incident became the subject of a court case. This is not unrealistic; in fact there are tens, if not hundreds, of cases being brought against the Ministry of Defence over the conduct of British soldiers in Iraq and Afghanistan. In court, we might expect the commanders in charge to be questioned about whether the attack was proportionate, or whether appropriate care was taken to distinguish between military and civilian targets. Nor is this far-fetched: when dropping bombs on targets where there is a risk to friendly troops, the pilot will ask for the ground commander’s initials as his or her acceptance of the increased risk.

In short, most militaries go to great lengths to avoid breaking the Laws of Armed Conflict, and a key way of doing this is making one person responsible for the use of lethal force, so that their judgement and intentions are on the line. And, fairly obviously, you cannot do this with autonomous systems. If innocent villagers die, who do we blame? The “commander” of the robot? The robot itself? The person who wrote the algorithm?

Going back to General Nick’s announcement, it was not made clear what roles these robots might play in the future British Army; nor whether, if they were able to deploy lethal force, humans would be kept ‘in the loop’ (i.e. a human pulls the trigger, as with a drone), kept ‘on the loop’ (i.e. a human can stop the robot pulling the trigger, as with a heat-seeking missile once fired), or left ‘out of the loop’ (i.e. a fully autonomous system with no human oversight). This is probably because he doesn’t know.
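For readers who think in code, those three modes amount to a simple permission gate. A minimal sketch follows; the mode names mirror the article’s terminology, while the function and its parameters are invented for illustration and describe no real weapons architecture:

```python
# Illustrative sketch of the three human-oversight modes described above.
# Nothing here reflects any real weapons system.
from enum import Enum, auto

class OversightMode(Enum):
    IN_THE_LOOP = auto()      # a human must actively pull the trigger
    ON_THE_LOOP = auto()      # the system fires unless a human vetoes in time
    OUT_OF_THE_LOOP = auto()  # fully autonomous; no human involvement at all

def may_engage(mode: OversightMode,
               human_authorised: bool = False,
               human_vetoed: bool = False) -> bool:
    """Whether the system is permitted to fire under each oversight mode."""
    if mode is OversightMode.IN_THE_LOOP:
        return human_authorised   # no explicit authorisation, no shot
    if mode is OversightMode.ON_THE_LOOP:
        return not human_vetoed   # fires by default; a human can only stop it
    return True                   # out of the loop: no human gate whatsoever
```

Moving down the list, the human’s role shrinks from a necessary condition, to a veto, to nothing at all; the accountability question above bites hardest at that final step.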

These very real legal and ethical challenges are top of the in-tray for many countries: at the end of 2019 the US military issued guidance on the use of lethal autonomous systems that was notable for asking the US Congress to help find a way through the minefield, and explicitly putting the subject of arms control on the table for discussion (something I have written about here and here with other types of weapons).

The real problem with autonomous weapons systems comes when you put strict short-term military expediency up against complex legal and ethical challenges that may only come to fruition in the longer term. What if, facing a near-peer competitor, a country is attacked with fully autonomous systems, with all of the military advantages outlined above? Weapons systems with humans in the loop would be quickly overwhelmed. One can imagine that in those circumstances of survival the current positions of the UK and the US (and many other countries) — that humans should always exercise oversight, authority and judgement over lethal autonomous systems — might crumble rather rapidly.

This, rightly, terrifies everyone: warfare is a human endeavour, and it requires humanity in its prosecution. Without that, we all lose.


Mike Martin is a former British army officer and War Studies Visiting Fellow at King’s College London. His latest book is Why We Fight.

Comments
Simon Denis
4 years ago

Who do we blame? You dodge the answer to your own question! We blame the person who started it; the person who launched the unprovoked strike, whether general, minister or terrorist. As for the wider issue, it is as old as war itself: military innovation briefly makes war more likely, precisely because it introduces an element of imbalance – the opposite of MAD. But the shortest way to imbalance is to convince an already spineless, guilt-ridden West to disarm whilst China is obviously building up her formidable arsenal. Finally, thank goodness for robots! For another obvious imbalance is demographic: the wider world teems even as the West withers. Without robots how in heaven’s name would we defend ourselves? Gibbon’s strange retro-prophecy may yet be realised unless we have the strength and vigilance to maintain an up-to-the-minute military establishment.

GA Woolley
4 years ago

This is a disappointingly one-dimensional and simplistic argument. For one thing, war in the next 20 years will be increasingly cyber, and even less human than robots. The attackers will be unattributable, and the targets will be hospitals, power stations, transport hubs and so on. The aim will not be conquest, but submission to demands – blackmail. There are ethical problems attached to these, but their users will disregard them.

As far as robots go, warfare has been becoming more automated for many years, and already has ‘autonomous’ weapons systems. But ‘automated’ or ‘autonomous’ does not mean given free rein to go around killing and destroying whatever it decides. These systems are deployed to achieve specific aims, within given parameters, which constantly change according to the situation. A simple example is an air defence system. It detects and classifies potential targets, with strict rules over how it reacts to them. A slow, low target at a distance represents no immediate threat, so it may alert another part of the system, but that’s all. At the other extreme, if it detects a ballistic target tracking towards friendly forces it may be programmed to fire on it without human intervention. The same can be done for ‘drones’: two people strolling along a street are ignored, but the man 100 yards away raising an AK47 is engaged.

But if I can correct the author’s final sentiment: war is an inhuman endeavour, and it sacrifices humanity in its prosecution.

Stephen Murray
4 years ago

Yes, there was a lot of “humanity” in the trenches of WW1, wasn’t there?

Adrian
4 years ago
Reply to  Stephen Murray

Because the gunner firing off shells couldn’t see the twisted face of the chaps he was blowing up.

Nun Yerbizness
4 years ago
Reply to  Adrian

as if that would have made a difference

Adrian
4 years ago
Reply to  Nun Yerbizness

It does make a difference. Check out Col. Dave Grossman’s “On Killing”.
Obviously you won’t, but some readers might.

Nun Yerbizness
4 years ago
Reply to  Adrian

those costs come months and years after the fact and the deed is long done.

war is hell.

Oliver Johnson
4 years ago

Another question I’d be asking – hinted at in the article itself – is how autonomous weapon systems would differentiate between civilians and enemy combatants, particularly if, as is so often the case now, the enemy looks like a civilian.

This debate in many respects – particularly in regard to drone strikes on the Afghan-Pakistani border – resembles the debate on ‘Air Control’ which occurred throughout the inter-war period, during various colonial wars across the Empire.

I highly doubt, however, that drones and robots will completely replace the need for actual boots on the ground. As Martin says, you need the human element or else you’ll just create more animosity towards you. I can, however, see drones becoming an essential part of how we fight, a sort of fourth element alongside boots on the ground, tanks and aircraft – though what form that would take, I have no idea.

Nun Yerbizness
4 years ago
Reply to  Oliver Johnson

“I highly doubt however that Drones and robots will completely replace the need for actual boots on the ground.”

hang around for another decade or two and you will see for yourself: the boots will be on the ground thousands of miles away from the devastation of the weaponry.

THE FUTURE OF UBIQUITOUS, REALTIME INTELLIGENCE: A GEOINT SINGULARITY

Imagine a future where realtime Earth observations with analytics are available globally to the average citizen, providing a tremendous wealth of information, insight, and intelligence. The opportunities seem immense, but what would the availability of ubiquitous, realtime intelligence mean to the military operator and warfighter?

https://aerospace.org/paper

Fraser Bailey
4 years ago

Yes, it’s all horrifying, and reminds me of one or two Philip K. Dick stories I read. But pretty much everything they have planned for us is horrifying, and there’s not much you can do.

M Spahn
4 years ago

Quite an oversight that you would write this article and make no mention of what just happened to Armenia. It could not be more relevant to your argument and it is in the news cycle right now.

Nun Yerbizness
4 years ago
Reply to  M Spahn

best drones win

Nun Yerbizness
4 years ago

I’m afraid, “very afraid”, that genie is out of the bottle and moving forward unstoppably.

johntshea2
4 years ago

Nice science fiction story, but:

“They don’t tire or need feeding, they don’t have morale problems.”

But they do need fuelling, and they malfunction and wear out.

Nun Yerbizness
4 years ago
Reply to  johntshea2

and then there is the software