
Drones, Robots and the Ethics of Armed Conflict in the 21st Century

Wow!

Robo Wars (The Oxford Martin School)

Did you know there are machines out there capable, once programmed, of searching out their target and delivering their lethal ‘payloads’ without further human intervention? In other words, the ‘decision’ to kill belongs to the machine itself.

Alex Leveringhaus argues that there is a case for banning such machines and insisting there should always be a human involved at the crucial moment. Alex’s case is that whereas a human is always capable of changing his or her mind, of exercising mercy and compassion, a machine just delivers, mindlessly and efficiently.

Abu Ghraib

Wil Wilson argues, on the other hand, that no human can be expected to analyse and make use of all the information that must be analysed in today’s armed conflicts – it must be automated. Machines, he insists, are more efficient, more capable of doing the job and less likely to make mistakes. Whilst recognising that only humans can be merciful and compassionate, he points out that humans can also be malicious and vindictive. The Abu Ghraib atrocities were not, after all, committed by machines.

Two soldiers fighting

Alex’s claim was vividly illustrated for me the following day when, as I was discussing the weekend with a friend, he told me about a friend of his father’s who, during the Second World War, was about to kill a German soldier when he recognised him as his old German teacher. The German recognised him too, and they decided instead to toss a coin to see who would take the other prisoner. The Englishman won. No-one died.

No machine could have recognised the German combatant as an old teacher, or seen that as a reason to hold its fire. Surely this is an excellent reason to insist that it is always a human who pulls the trigger?

On the other hand, if the soldier hadn’t been his old German teacher, he would have shot him and – well – that’s the end of it. One soldier kills another. Surely that’s what war is all about? Would the introduction of machines change that? Wouldn’t it just mean that the soldiers on each side are safer, both physically and mentally? After all, if machines do the dirty work, who cares if they get blown up? No machine will ever suffer post-traumatic stress, or burst into tears many years later at the thought of what it did during the war (as, in his eighties, my father did).

Robot

There is another problem, of course, with fully automated weapons: will they ever be sufficiently free of risk to be worth it? They will have to learn, and once a machine can learn its behaviour will not be fully predictable. Might a machine be capable of the sort of atrocity a human is? Could a robot run amok and kill the very villagers it is supposed to protect? If so, who is to blame? Not the machine – machines are not moral agents. The programmer, perhaps? Alex believes the problem is not a ‘responsibility gap’, because the programmer can be blamed. But risk is a problem: will we ever be sure enough that a machine will not do this? Wil has faith in technology – he believes we will reach a stage where the risk, albeit still there, will be worth taking for the sake of the humans who would otherwise be risking their lives.

This was an extraordinary weekend: very enjoyable and extraordinarily thought-provoking. It came about because, after one of my open day lectures, I was approached by a man who told me he was one of the officers contributing to army policy on fully automated weapons (or ‘killer robots’ as the tabloids would have it). We had a fascinating conversation and I asked him if he would do a weekend school for us.

Normally, of course, my speakers are philosophers, but this seemed too good an opportunity to miss. Paul Gittins agreed. In the event, however, Paul was posted to the Gulf and instead he sent Wil Wilson, who did an excellent job.

During the weekend one of the sessions was a ‘conversation’ between Wil and Alex, and many people found it extremely illuminating. I shall certainly consider incorporating that into future weekends.

In the meantime, what do you think – should we ban fully automated weapons, or should they be permitted?

Here is some extra reading:

Inside the Pentagon’s Effort to Build a Killer Robot (Time Magazine)

Human Rights Watch Campaign to Stop Killer Robots

Do Killer Robots Violate Human Rights? (The Atlantic)


