April 14, 2015
Statement to the UN Convention on Certain Conventional Weapons on Technical Issues
CNAS is an independent, non-profit research institution with an ongoing project on the legal, ethical, moral, and policy implications of autonomous weapons. I would like to share with you some of the findings from our research on technical issues.
Today, with a few isolated exceptions, lethal, fully autonomous weapons do not exist. However, automation and autonomy continue to be incorporated into a wide variety of functions, including those relating to the use of force. Even if efforts to pre-emptively ban the use of lethal autonomous weapon systems (LAWS) are largely successful, some states are likely to continue pursuing their development and, potentially, to one day deploy them. The international community has wisely chosen to begin preparing for a time when states have the ability to build weapon systems that can autonomously select and engage targets without human intervention.
Lethal autonomous weapon systems are not inherently counter to the principles of international humanitarian law. If properly designed and employed in a manner that is consistent with the requirements of proportionality and distinction, the use of such systems could be lawful. For this reason, it will be critical to develop shared expectations about standards for testing, evaluation, employment, and accountability.
A number of states and non-governmental organizations have expressed concerns that LAWS cannot distinguish between military and civilian contexts and therefore could select and engage a military target located in a civilian environment, resulting in civilian casualties. Such concerns are valid but could be mitigated by restrictions on the circumstances in which LAWS can be employed. For example, use of LAWS could be limited to the undersea domain or to other areas where civilians are not present.
Likewise, some concerns about the use of LAWS could be mitigated by design characteristics that constrain their freedom of operation. These constraints could include limits on both the length of time and the geographic area in which the system is allowed to operate. This would, in turn, increase a commander's control over the system and strengthen accountability for the system's use. LAWS could also be designed to target only military hardware, providing an additional layer of protection against the targeting of civilians.
Regardless of the system’s exact degree of autonomy, states should implement strict testing and evaluation standards to ensure that the system performs as intended and that procedures for its use are clear to operators. Systems should be tested in a realistic operating environment to ensure that they continue to operate in accordance with their design features.
Finally, it will be essential for states to further explore the ways in which accountability for the use of LAWS can be established and enforced. While such an undertaking will undoubtedly be challenging, it will be critical to ensuring that the use of LAWS adheres to international humanitarian law.
Such issues represent important areas for discussion, and failure to establish a shared understanding of appropriate standards for testing, evaluation, employment, and accountability could have profound consequences for the future of international security. These technical issues merit careful consideration during this week's discussions.