July 29, 2016
Who’ll want artificially intelligent weapons? ISIS, democracies, or autocracies?
One of the biggest fears about the nexus of artificial intelligence and the military is that machine learning, a type of artificial intelligence that allows computers to learn from new data without being explicitly programmed, could spread rapidly through military systems and even risk an arms race. Alarm over the consequences of armed robots extended even to the Dallas Police Department's use, in July 2016, of a remotely piloted, rather than autonomous, bomb-disposal robot retrofitted with an explosive. That event triggered a wave of articles about the consequences of arming robots, especially outside the military.
Discussions of the military applications of robotics have tended to focus on the United States, largely because of America's extensive use of uninhabited (also called unmanned) aerial vehicles, or "drones," to conduct surveillance and launch lethal strikes against suspected militants around the world. Yet limiting the discussion of military robotics to systems developed by wealthy, democratic countries such as the United States may miss important underlying trends.
Read the full article on the Bulletin of the Atomic Scientists.