Airborne drones are becoming commonplace, especially in the military. Now a new report presented to the United Nations Human Rights Council suggests that lethal autonomous robots — weapons systems that can attack targets without any human input — need to be regulated before they become the military weapons of the future. The report, which was debated at the Human Rights Council in Geneva on May 29, states that the United States, Israel, the United Kingdom, South Korea and Japan all possess lethal robots that are either fully or semi-autonomous.
Some of these machines — or “lethal autonomous robotics” (LARs), as they are called in the report — can allegedly select and engage their own targets without human input. The draft report argues that such killer robots “should not have the power of life and death over human beings”.
Report author Christof Heyns, a South African professor of human rights law, calls for a worldwide moratorium on the “testing, production, assembly, transfer, acquisition, deployment and use” of killer robots until an international conference can develop rules for their use.
The use of unmanned aircraft to carry out bombing missions in the Middle East is already a hot-button issue in the U.S., and killer robots have recently been drawing attention from several groups that want to halt their ongoing development. The report for the UN Human Rights Council deals with the legal and philosophical issues involved in giving robots lethal power over humans, echoing countless science-fiction novels and films. The debate dates to author Isaac Asimov’s first rule for robots in the 1942 story “Runaround”: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” In November 2012, Human Rights Watch called for an international ban on fully autonomous robots. And just last month, the Campaign to Stop Killer Robots was launched in London by a coalition of human rights groups demanding a ban on the future development of autonomous weapons. Christof Heyns writes in the report:
“Decisions over life and death in armed conflict may require compassion and intuition. Humans — while they are fallible — at least might possess these qualities, whereas robots definitely do not.”
While the U.N. report focuses mainly on LARs, it also decries the recent upsurge in the use of unmanned aerial vehicles — or drones — by the U.S. military and other nations. According to the Associated Press, the report cites at least four examples of fully or semi-autonomous weapons that have already been developed around the world. These include the U.S. Phalanx system for Aegis-class cruisers, which automatically detects, tracks and engages anti-air warfare threats such as anti-ship missiles and aircraft; Israel’s Harpy, a “fire-and-forget” autonomous weapon system designed to detect, attack and destroy radar emitters; and South Korea’s Samsung Techwin surveillance robots, which autonomously detect targets in the demilitarized zone between North and South Korea and are currently operated by humans but have an “automatic mode”. Britain’s Taranis jet-propelled combat drone prototype can autonomously search for, identify and locate enemies, and can defend itself against enemy aircraft, but can only engage a target when authorized by mission command. Drones, the report notes, “enable those who control lethal force not to be physically present when it is deployed, but rather activate it while sitting behind computers in faraway places, and stay out of the line of fire.”
The report continues: “Lethal autonomous robotics, if added to the arsenals of States, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill — and their execution.”
Heyns notes the arguments of robot proponents that death-dealing autonomous weapons “will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not inflict intentional suffering on civilian populations, for example through torture. Robots also do not rape.”
The report goes beyond the recent debate over drone killings of al-Qaida suspects and the nearby civilians who are maimed or killed in those air strikes. Drones still have human oversight; killer robots would be programmed to make lethal decisions on the spot, without orders from humans.
Current weapons systems are supposed to have some degree of human oversight. But Heyns notes that “the power to override may in reality be limited because the decision-making processes of robots are often measured in nanoseconds and the informational basis of those decisions may not be practically accessible to the supervisor. In such circumstances humans are de facto out of the loop and the machines thus effectively constitute LARs,” or killer robots.
Separately, another UN expert, British lawyer Ben Emmerson, is preparing a special investigation for the UN General Assembly this year on drone warfare and targeted killings. His probe was requested by Pakistan, which officially opposes the use of US drones on its territory as an infringement of its sovereignty but is believed to have tacitly approved some strikes in the past. Pakistani officials say the drone strikes kill many innocent civilians, a claim the United States rejects. The other two countries requesting the investigation were Russia and China, both permanent members of the UN Security Council. The Campaign to Stop Killer Robots, the alliance of activist and humanitarian groups led by Human Rights Watch, applauded Heyns’ draft report in a statement on its website.