Beware the killer robots

GlobalPost
The World

SAN FRANCISCO — Technology has changed the nature of warfare from the Bronze Age to the Atomic Age. Now a new report focuses on the next stage in military technology: the imminent arrival of weapons designed to fight on their own.

"The use of military robotics represents a new era in warfare," writes co-author Patrick Lin, a philosopher with the Ethics & Emerging Technologies Group at California State Polytechnic University, in the report, "Autonomous Military Robotics: Risk, Ethics, and Design,".

"Robots are not merely another asset in the military toolbox, but they are meant to also replace human soldiers," the report continues. "As such, they raise novel ethical and social questions that we should confront as far in advance as possible."

The 108-page report, prepared for the U.S. Navy’s Office of Naval Research, does not predict when independent killing machines might go to war, nor does it take a firm stand for or against their deployment. Instead it suggests that advances in artificial intelligence, coupled with the growing use of semi-autonomous robots in battle, make it a question of when, not if, robots will become capable of making life and death decisions.

The report says the development of military robots stems from a desire to minimize casualties among American troops.

"Two key Congressional mandates are driving the use of military robotics," the authors write; the first urges a significant deployment of unmanned deep-strike aircraft by 2010, and the second mandates the widespread use of driverless combat vehicles by 2015. These first-generation machines are not intended to act independently. "Most, if not all, of the robotics in use and under development are semi-autonomous at best," the report says, adding that, "the technology to (responsibly) create fully autonomous robots is near but not quite in hand."

But war is driving the pace of development. The report says an estimated 5,000 semi-autonomous robots have been used in Iraq and Afghanistan. One common application is the use of unmanned aerial vehicles (UAVs) to provide reconnaissance. Some UAVs, like the aptly named Predator, are designed to carry out air strikes. "They can navigate autonomously toward targets specified by GPS coordinates, but a remote operator located in Nevada (or in Germany) makes the final decision to release the missiles," the report says.

The technological and moral considerations involved in putting that killing decision on autopilot make up the body of the report. The authors tip their hat to science fiction writer Isaac Asimov, who wrote about the three laws of robotics — meant to prevent robots from harming humans — in stories that showed how these mandates could be subverted. That sets the stage for considering how much more difficult it would be to create rules to tell a machine when and whom to kill. The report calls this a top-down approach and says a designer can’t hope to anticipate all the eventualities. But if builders opt for the alternative bottom-up approach, creating an artificial intelligence that can learn, how can they be confident that their machines will absorb the desired lessons?

Even so, the report entertains the possibility that properly designed robots might be less prone to commit atrocities. It cites a U.S. Army Surgeon General’s survey of troops in Iraq that found less than half of soldiers and Marines believed non-combatants should be treated with respect and dignity. "In the not too distant future," the Cal Poly authors say, "relatively autonomous robots may be capable of conducting warfare in a way that matches or exceeds the traditional jus in bello morality of a human soldier."

The United States is not the only nation developing robots for advanced roles in war and peace, or mulling how to control their behavior. The report notes that an academic conference in the United Kingdom recently heard warnings that the world was on the verge of a robotic arms race. "Robotics is a particularly thriving and advanced industry in Asia," the Cal Poly authors write, citing an effort underway in South Korea to develop a robot ethics charter, "though the document has yet to materialize."

"Autonomous Military Robotics" may sound tame to Americans raised on movies like "2001: A Space Odyssey" and "Terminator", in which intelligent machines turn on their creators. But in a scholarly way the report reminds us that technology is posed to bring science fiction to life.
