May 30, 2018

Technology Roulette

Managing Loss of Control as Many Militaries Pursue Technological Superiority

By Richard Danzig

Executive Summary

This report recognizes the imperatives that inspire the U.S. military’s pursuit of technological superiority over all potential adversaries. These pages emphasize, however, that superiority is not synonymous with security. Experience with nuclear weapons, aviation, and digital information systems should inform discussion about current efforts to control artificial intelligence (AI), synthetic biology, and autonomous systems. In this light, the most reasonable expectation is that the introduction of complex, opaque, novel, and interactive technologies will produce accidents, emergent effects, and sabotage. In sum, on a number of occasions and in a number of ways, the American national security establishment will lose control of what it creates. 

A strong justification for our pursuit of technological superiority is that this superiority will enhance our deterrent power. But deterrence is a strategy for reducing attacks, not accidents; it discourages malevolence, not inadvertence. In fact, technological proliferation almost invariably follows closely on technological innovation. The risks from the resulting growth in the number and complexity of interactions are amplified by the fact that proliferation places great destructive power in the hands of others whose safety priorities and standards are likely to be less ambitious and less well funded than ours. 

Accordingly, progress toward our primary goal, superiority, should be expected to increase rather than reduce collateral risks of loss of control. This report contends that, unfortunately, we cannot reliably estimate the resulting risks. Worse, there are no apparent paths for eliminating them or even keeping them from increasing. An often referenced recourse, keeping “humans in the loop” of operations involving new technologies, appears on inspection to offer little and declining benefit. 

We are not, however, impotent. With more attention the American military at least can dampen the likely increase in accidents and moderate adverse consequences when they occur. Presuming that the United States will be a victim of accidents, emergent effects, and sabotage, America should improve its planning for coping with these consequences. This will involve reallocating some of the immense energy our military invests in preparing for malevolence to planning for situations arising from inadvertent actions and interactions. 

The U.S. Department of Defense and intelligence agencies also must design technologies and systems not just for efficacy and efficiency, but also with more attention to attributes that can mitigate the consequences of failure and facilitate resilient recovery. The pervasive insecurity of digital information systems should be an object lesson: it is exceedingly costly, perhaps impossible, to counter loss of control after we have become dependent on a new technology, rather than at the time of design. 

Most demandingly, the United States also must work with its opponents to facilitate their control and minimize their risks of accidents. Twenty-first century technologies are global not just in their distribution, but also in their consequences. Pathogens, AI systems, computer viruses, and radiation that others may accidentally release could become as much our problem as theirs. Agreed reporting systems, shared controls, common contingency plans, norms, and treaties must be pursued as means of moderating our numerous mutual risks. The difficulty of taking these important steps should remind us that our greatest challenge is not in constructing our relationships to technologies; it is in constructing our relationships with each other. 

These arguments are addressed to the national security community. The reflections and recommendations here, however, transcend their particulars and bear on all discussion about the control of dangerous new technologies. 

The full report is available online.


Authors

  • Richard Danzig

    CNAS Board of Directors, Senior Advisor, Johns Hopkins Applied Physics Laboratory

Richard Danzig’s primary activity is as a consultant to U.S. Intelligence Agencies and the Department of Defense on national security issues. He is a Senior Advisor to the Jo...
