The following is an English translation of a news story that appeared in the July 1, 2013, edition of the Russian news site Lenta. SMU Dedman School of Law professor Christopher Jenks, an internationally respected expert on the law of armed conflict, provided expertise for this story.
July 2, 2013
"Humanity should not surrender meaningful control over questions of life and death to machines." This is not a quote from the fictional universe of the 1984 movie "The Terminator," in which the self-aware supercomputer Skynet orders the destruction of the human race. This is an appeal by United Nations Special Rapporteur Christof Heyns for a global ban on lethal autonomous robotics (LARs).
The call for a moratorium by Heyns, a human rights law professor at the University of Pretoria in South Africa, is similar to the Human Rights Watch campaign to ban "killer robots."
Southern Methodist University Dedman School of Law professor Christopher Jenks, an expert in the law of armed conflict, says such a ban is "unrealistic, belated and short-sighted, and ignores mankind's frailties." Noting the moral, ethical and legal issues associated with LARs, Jenks says, "I am not advocating their use, but I object to an outright ban."
Many are unaware that autonomous defense systems have existed for more than 20 years and have already mistakenly taken human lives in armed conflict, Jenks says.
"In each of the Gulf wars, U.S. Patriot missile systems misidentified targets and launched missiles, shooting down a U.S. military aircraft in one Gulf war and a British military aircraft in the other, killing both pilots," Jenks explains. "Five years ago, South Africa also had a tragic instance of what was called a 'robot cannon' malfunctioning, killing nine South African soldiers."
Jenks believes a ban would ignore "the problematic nature of human judgment and error in the targeting process."
"As Ben Wittes at the Brookings Institution has said, LARs did not take a single human life in World War II, or Rwanda or Cambodia — mankind did that all on its own," he says. "At a minimum, it's at least possible that at some point in the not-too-distant future LARs may be able to better distinguish between combatants and civilians, from a greater distance and more accurately, than humans, and thus cause fewer civilian casualties."
The U.S. and Israel are the most prominent developers and users of LARs technology, Jenks says, but other countries are testing, developing and using such systems, particularly defensive ones.
"I think the use of offensive systems, if it occurs, will be a matter of degree," Jenks says. "For example, there might be less concern about a LARs anti-submarine system. Since there are really no civilian submarines, there would be far less risk of civilian casualties."
# # #