The U.N. Will Not Stand for Killer Robots

A model of the "Terminator" robot (David Ramos/AP File Photo)

Many U.N. advocates would rather trust a human to pull a trigger than leave it to a machine set to autopilot.

President Obama may have finally clarified the U.S. position on armed assassins in the sky, but the next wave of drone controversy may now center on whether robots on the field of battle are smart enough to gun down human beings. At a meeting of the United Nations Human Rights Council in Geneva on Thursday, a top U.N. official on executions gave the world his best Sarah Connor impression, urging a moratorium on terminators, or Lethal Autonomous Robotics (LARs), a warning he hopes will stop a future of killer robots that may already be past the point of no return if leading military technologists have anything to say about it. "War without reflection is mechanical slaughter.... A decision to allow machines to be deployed to kill human beings worldwide — whatever weapons they use — deserves a collective pause," said Christof Heyns, the U.N.'s special rapporteur on extrajudicial, summary or arbitrary executions. That is one fancy title, but his message is simple, familiar, and likely in vain: many advocates would still rather trust a human to pull a trigger than leave it to SkyNet, or, well, a machine set to autopilot by the U.S., Israeli, British, or Korean military.

But, yes, the United Nations listened to a debate about killer robots. Thursday's session came just three weeks after the United States Congress held a hearing about other Earths because, well, the line between reality and science fiction is closing fast enough for the world to truly weigh in. Currently, there are no fully autonomous and armed robots in action: early attempts have gone awry, and while the Pentagon has not been shy about wanting to develop stand-alone shooters, it has insisted by official policy that a human being will always be "in the loop." Human Rights Watch and Harvard Law School, gleaning information from the U.S. Air Force, have reported that "by 2030 machine capabilities will have increased to the point that humans have become the weakest component in a wide array of systems and processes." So, by the time Suri Cruise is 24, humans really will start to be the weakest links on the battlefield. In the meantime, a few superpowers and would-be superpowers are building up their LAR arsenals.

Read more at The Atlantic Wire