Problems of Using Autonomous Military AI Against the Background of Russia's Military Aggression Against Ukraine

Baltic Journal of Legal and Social Sciences 2022 (4):131-145 (2022)

Abstract

The application of modern technologies with artificial intelligence (AI) in all spheres of human life is growing exponentially, as is concern for its controllability. The lack of public, state, and international control over AI technologies creates large-scale risks that such software and hardware will (un)intentionally harm humanity. The events of recent months and years, particularly the Russian Federation's war against its democratic neighbour Ukraine and other notable international conflicts, support the thesis that the uncontrolled use of AI, especially in the military sphere, may lead to deliberate disregard for the moral standards of controlled AI or to the spontaneous emergence of aggressive autonomous AI. The development of legal regulation for AI technologies lags behind the rapid advancement of these artefacts, which simultaneously affect all areas of public relations. Control over the creation and use of AI should therefore be exercised not only through purely technical regulation (e.g., technical standards and conformity assessments, corporate and developer regulations, requirements enforced through industry-wide ethical codes), but also through comprehensive legislation and intergovernmental oversight bodies that codify and enforce specific changes in the rights and duties of legal persons. This article presents the "Morality Problem" and the "Intentionality Problem" of AI and reflects upon various lacunae that arise when AI is implemented for military purposes.

Links

PhilArchive


Similar books and articles

Military Robots and the Question of Responsibility. Lamber Royakkers & Peter Olsthoorn - 2014 - International Journal of Technoethics 5 (1):1-14.
Negotiating autonomy and responsibility in military robots.Merel Noorman & Deborah G. Johnson - 2014 - Ethics and Information Technology 16 (1):51-62.
The Ethics of Autonomous Military Robots.Jason Borenstein - 2008 - Studies in Ethics, Law, and Technology 2 (1).
Employing Lethal Autonomous Weapon Systems.Matti Häyry - 2020 - International Journal of Applied Philosophy 34 (2):173-181.
The Unfounded Bias Against Autonomous Weapons Systems.Áron Dombrovszki - 2021 - Információs Társadalom 21 (2):13–28.
Doctor of Philosophy Thesis in Military Informatics (OpenPhD): Lethal Autonomy of Weapons is Designed and/or Recessive. Nyagudi Nyagudi Musandu - 2016 - Dissertation, OpenPhD (#OpenPhD), e.g. Wikiversity, https://en.wikiversity.org/wiki/Doctor_of_Philosophy
Autonomous Weapons and Distributed Responsibility.Marcus Schulzke - 2013 - Philosophy and Technology 26 (2):203-219.
Ethics, law, and military operations.David Whetham (ed.) - 2011 - New York, NY: Palgrave-Macmillan.

Analytics

Added to PP
2023-02-07

Downloads
273 (#70,633)

6 months
173 (#15,580)


Author's Profile

Tyler Jaynes
University of Utah

Citations of this work

No citations found.


References found in this work

No references found.
