The UK government is refusing to back a ban on building “killer robots,” the Guardian reports.
Killer robots may sound like science fiction, but their regulation is a debate that has become increasingly important. In technical terms, they’re Lethal Autonomous Weapons Systems (Laws): machines that have the capability to kill without any human actively giving the order.
While a drone might automatically detect possible threats, it takes a human pilot on the ground to actually pull the trigger. A killer robot, in contrast, might use lethal force automatically, without ever checking in with a human controller.
Killer robots don’t exist yet. And there’s a growing campaign to outlaw them before they can ever come into existence, spearheaded by a bluntly named coalition of human rights groups, the Campaign to Stop Killer Robots.
One of the fundamental problems a lot of people have with killer robots is accountability. When a soldier shoots a child, it’s clear who is to blame. But if a killer robot accidentally slaughters civilians — who is at fault? The programmer? The commander? The purchaser? The repairman?
The campaign says that “giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology. Human control of any combat robot is essential to ensuring both humanitarian protection and effective legal control.”
So why is the UK government opposing a ban on the technology? A spokesperson told the Guardian that “at present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area.”
The Foreign Office adds: “The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems.”
For these reasons, the UK government has opposed the ban proposed at a UN conference in Geneva this week.
A retired lawyer and military veteran, Boothby, also gave the Guardian an example of when he thinks using killer robots could be advantageous:
One scenario he suggested was where a young soldier might be ordered to clear a house of enemy troops. He might, from a bright sunlit street, enter a dark room, detect movement and in perceived self-defence open fire killing a mother and her young children.
“Who will say that a piece of machinery might not one day be developed capable of differentiating [between armed soldiers and non-combatants]?” Boothby said.