Make Them Rare or Make Them Care: Artificial Intelligence and Moral Cost-Sharing
Abstract
The use of autonomous weaponry in warfare has increased substantially over the last twenty years and shows no sign of slowing. Our chapter raises a novel objection to the implementation of autonomous weapons, namely, that they eliminate moral cost-sharing. To grasp the basics of our argument, consider the case of lethal autonomous weapons systems (LAWS), such as uninhabited aerial vehicles that act autonomously. Imagine that a LAWS terminates a military target and that five civilians die as a side effect of its bombing. Because the LAWS lacks moral agency, and in particular the capacity for moral emotions, the costs of the strike are borne entirely by the dead civilians and their loved ones; no cost is shared by the agent that inflicted the harm. We argue that this is unjust insofar as those responsible for unjust harm to others ought to share in the costs of that harm. Our worry extends to other strategic uses of AI, e.g., cyber warfare. This presents a dilemma: either we design autonomous weaponry capable of moral emotions, or we limit the use of autonomous weaponry. The former undermines the risk-mitigation purpose of creating autonomous weaponry and expands the number of sentient individuals whose welfare is put at risk in war. The latter risks worsening combatant casualties and undermining the achievement of strategic aims.