Abstract

The development of artificial intelligence and its use for lethal purposes in war will fundamentally change the nature of warfare as well as law enforcement, and thus poses deep problems for the stability of the international system. To cope with such changes, states should adopt preventive security governance frameworks based upon the precautionary principle of international law and upon previous cases where prevention brought stability to all states. Such new global governance frameworks must be innovative, as current models will not suffice. The World Economic Forum has identified robotics and artificial intelligence as the two areas that will bring the greatest benefits, but also the greatest dangers, to the future; they are also the areas in most urgent need of innovative global governance.

Leading scientists working on artificial intelligence have argued that the militarization and use of lethal artificial intelligence would be highly destabilizing. Here I examine twenty-two existing treaties that acted under a “preventive framework” to establish new regimes prohibiting or controlling weapons systems deemed destabilizing. These treaties achieved one or more of three goals: preventing further militarization, making weaponization unlawful, and stopping proliferation through cooperative frameworks of transparency and common rules. My findings make clear that a significant norm is emerging with regard to all weapons systems: the use of disarmament and arms regulation as a mechanism to protect civilians. The development of lethal autonomous weapons systems would severely jeopardize this emerging norm. I show under what conditions lethal autonomous weapons systems would disrupt peace and security, and I propose alternative governance structures based upon international law with robust precautionary frameworks.
