February 20, 2016
Autonomous Weapons in the News
New CNAS Research on Autonomous Weapons
CNAS Adjunct Senior Fellow Michael C. Horowitz released two papers this week on issues surrounding autonomous weapons.
Public opinion and the politics of the killer robots debate
Published in the academic journal Research & Politics, this paper explores public attitudes toward autonomous weapons and the claim that mass public opposition would mean autonomous weapon systems violate the Martens Clause of the Hague Convention. The paper shows that context, including the protection of troops and the relative effectiveness of the weapons, plays a crucial role in shaping public attitudes. Given the way context influences public attitudes, the paper concludes that modesty is warranted when making claims about public perceptions of autonomous weapons.
The ethics and morality of robotic warfare: assessing the debate over autonomous weapons
This draft working paper focuses on discussions concerning the ethics and morality of lethal autonomous weapon systems (LAWS). It argues that the category of LAWS is so broad that it may make sense to analyze LAWS in three categories: munitions, platforms, and operations planning systems. The paper then uses that framework to assess ethical and moral debates about the controllability of LAWS, accountability and responsibility concerns, and human dignity.
Autonomous weapons in the news
Australian professor warns killer robots, invisible drones are real
(International Business Times) Vittorio Hernandez describes an Australian professor who, through analysis of military budgets, inferred that various autonomous weapon technologies are in development, including biological systems and various stealth technologies for drones.
If killer robots arrive, the Terminator will be the least of our problems
(The Washington Post) Matt McFarland discusses some of the dangers of autonomous weapons and the lack of regulations. He raises the issue of autonomous weapon proliferation and specifically the possibility of non-state actors acquiring these technologies. He argues current negotiations regarding regulations are moving at a “glacial” pace and must be stepped up.
In Munich, a frightening preview of the rise of killer robots
(The Washington Post) David Ignatius reports on the recent Munich Security Conference. He discusses the increased likelihood of autonomous weapons and how they could influence future wars. He goes on to discuss other security issues raised at the conference, such as data security and non-state actors.
It Is Time We Considered a Ban on Autonomous Weapons
(The Monitor Daily) John Birks reports on a recent panel held by the World Economic Forum at Davos concerning a potential autonomous weapons ban. The panel discussed some of the ethical, moral, and legal challenges presented by autonomous weapons and put forth a call for a ban. The panel described it as a “preemptive” ban that would prevent autonomous weapons from being developed to the point of becoming dangerous.
Robots Can Learn Ethical Behavior By Reading Children's Stories
(Tech Times) Katherine Derla discusses a new technology that could “train” artificial intelligence to behave in social settings. By reading children’s books, the robots would absorb and adopt rational human decisions in order to develop a form of moral reasoning. The technology, named Quixote, would “promote the options that will not cause harm to humans.”
Should We Fear an AI Arms Race?
(Defense One) Andrew Lohn, Andrew Parasiliti, and William Welser IV lay out what they see as the five main risks of autonomous weapons: control, hacking, targeting, mistakes, and liability. Despite these issues, they believe regulation would “maximize benefits while minimizing risk” of autonomous weapons. They argue that regulation would be more useful than the ban some tech titans (Elon Musk, Stephen Hawking, and Steve Wozniak) have called for.
There's a Pointless War Being Waged on Killer Robots From the Future
(Vice News) Ryan Faith argues that a ban on autonomous weapons is redundant, given the various developments that have already dehumanized war, such as homing cruise missiles and radar-guided weapons. Similarly, Faith argues that the moral case for an autonomous weapons ban is not valid, since there would still be a human “in the loop” giving the order to launch the system, just as the President orders the launch of a nuclear missile.
Special thanks to University of Pennsylvania researcher Carter Goodwin for pulling together this news roundup.