Abstract
The debate on the ethical permissibility of autonomous weapon systems (AWS) is deadlocked and could therefore benefit from a differentiated assignment of the burden of proof. Such a differentiation is warranted because the discussion is not purely philosophical: it has legal and security-policy components and aims to avoid the most harmful outcomes of an otherwise unchecked development. Opponents of a universal AWS ban must clearly demonstrate that AWS comply with the Law of Armed Conflict (LOAC). This requires extensive testing of specific models to verify compliance in realistic combat situations; developing appropriate testing regimes is accordingly one of the most important future tasks for AWS ethics. In this context, however, the prohibitionist claim that AWS violate LOAC a priori is easily refuted. A state's right to ensure security and the protection of human rights through military capacity has high ethical value. Other arguments against the use of AWS that are not based on LOAC conformity must therefore meet a substantial burden of proof, as they call on governments to abandon a military technology that is at least potentially transformative. None of these objections meets that standard. They include not only culturally or even individually divergent norms and ideas regarding the use of weapons, but also consequentialist (security-policy) concerns on the one hand and so-called deontological arguments on the other. While the former at least point to the need for essential regulation, the latter rest on rather dubious premises. To arrive at solutions beneficial to humanity, prohibitionists should formulate narrower and more specific points of criticism.