Employing Lethal Autonomous Weapon Systems

International Journal of Applied Philosophy 34 (2):173-181 (2020)

Abstract

The ethics of warfare and military leadership must pay attention to the rapidly increasing use of artificial intelligence and machines. Who is responsible for the decisions made by a machine? Do machines make decisions at all, and may they make them? These questions are of particular interest in the context of lethal autonomous weapon systems. Are such systems autonomous or merely automated? Do they violate international humanitarian law, which requires that humans always remain responsible for the use of lethal force and for the assessment that civilian casualties are proportionate to the military objectives? The article analyses relevant documents, opinions, government positions, and commentaries using the methods of applied ethics. The main conceptual finding is that the definition of autonomy depends on what the party presenting it seeks to support. Those who want to use lethal autonomous weapon systems call them by another name, say, automated rather than autonomous, and impose standards on autonomy that machines do not meet, such as moral agency. Those who wish to ban the use of lethal autonomous weapon systems define autonomy far less demandingly, requiring little more than that the system form a self-standing part of the causal chain. The article's argument is that the question of responsibility is best approached by setting aside the most controversial philosophical considerations and simply stating that an individual or a group of people is always responsible for the equipment they produce and use. This does not mean that those who press the button, or their immediate superiors, are to blame; they are doing their jobs within a system. Those responsible are more plausibly found in the higher military leadership, in the political decision-makers who set their goals, and, at least in democracies, in the citizens who have elected those decision-makers.

Links

PhilArchive





Similar books and articles

Doctor of Philosophy Thesis in Military Informatics (OpenPhD): Lethal Autonomy of Weapons is Designed and/or Recessive. Nyagudi Nyagudi Musandu - 2016-12-09 - Dissertation, OpenPhD, e.g. Wikiversity, https://en.wikiversity.org/wiki/Doctor_of_Philosophy, etc.
Feasible Precautions in Attack and Autonomous Weapons. Jeffrey S. Thurnher - 2018 - In Wolff Heintschel von Heinegg, Robert Frau & Tassilo Singer (eds.), Dehumanization of Warfare: Legal Implications of New Weapon Technologies. Springer Verlag. pp. 99-117.
Autonomous Weapon Systems: Failing the Principle of Discrimination. Ariel Guersenzvaig - 2018 - IEEE Technology and Society Magazine 37 (1):55-61.
Meaningful Human Control – and the Politics of International Law. Thilo Marauhn - 2018 - In Wolff Heintschel von Heinegg, Robert Frau & Tassilo Singer (eds.), Dehumanization of Warfare: Legal Implications of New Weapon Technologies. Springer Verlag. pp. 207-218.


