Abstract
Technological developments in the sphere of artificial intelligence (AI) inspire debates about the implications of autonomous weapon systems (AWS), which can select and engage targets without human intervention. While a growing number of systems that could qualify as AWS, such as loitering munitions, are reportedly being used in armed conflicts, the global discussion about a system of governance and international legal norms on AWS at the United Nations Convention on Certain Conventional Weapons (UN CCW) has stalled. In this article we argue for the necessity of adopting legal norms on the use and development of AWS. Without a framework for global regulation, state practices in using weapon systems with AI-based and autonomous features will continue to shape the norms of warfare and to affect the level and quality of human control in the use of force. By examining the practices of China, Russia, and the United States in their pursuit of AWS-related technologies and their participation in the UN CCW debate, we acknowledge that these states' differing approaches make it challenging for states parties to reach an agreement on regulation, especially in a forum based on consensus. Nevertheless, we argue that global governance on AWS is not impossible. It will depend on the extent to which an actor or group of actors is ready to take the lead on an alternative process outside the CCW, inspired by the direction of travel set by previous arms control and weapons ban initiatives.