U.N. report questions the use of killer robots

UNITED NATIONS - Killer robots that can attack targets without any human input "should not have the power of life and death over human beings," a new draft U.N. report says.

The report for the U.N. Human Rights Council, posted online this week, deals with legal and philosophical issues involved in giving robots lethal powers over humans, echoing countless science-fiction novels and films. The debate dates to author Isaac Asimov's first rule for robots in the 1942 story "Runaround."

"A robot may not injure a human being or, through inaction, allow a human being to come to harm," the rule stated.

Report author Christof Heyns, a South African professor of human-rights law, calls for a worldwide moratorium on the "testing, production, assembly, transfer, acquisition, deployment and use" of killer robots until an international conference can develop rules for their use.

His findings are due to be debated at the Human Rights Council in Geneva on May 29.

According to the report, the United States, Britain, Israel, South Korea and Japan have developed various types of fully or semiautonomous weapons.

In the report, Heyns focuses on a new generation of weapons that select their targets and kill them without human intervention. He calls them "lethal autonomous robotics," or LARs for short, and says: "Decisions over life and death in armed conflict may require compassion and intuition. Humans - while they are fallible - at least might possess these qualities, whereas robots definitely do not."

He notes the arguments of robot proponents that death-dealing autonomous weapons "will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations, for example through torture. Robots also do not rape."

The report goes beyond the recent debate over drone strikes on al-Qaeda suspects and the nearby civilians who are maimed or killed in those attacks. Drones still operate under human oversight; killer robots would be programmed to make lethal decisions on the spot, without orders from humans.