Autonomous Machines, Moral Judgment, and Acting for the Right Reasons

Ethical Theory and Moral Practice 18 (4):851-872 (2015)

Abstract

We propose that the prevalent moral aversion to autonomous weapons systems (AWS) is supported by a pair of compelling objections. First, we argue that even a sophisticated robot is not the kind of thing that is capable of replicating human moral judgment. This conclusion follows if human moral judgment is not codifiable, i.e., if it cannot be captured by a list of rules. Moral judgment requires either the ability to engage in wide reflective equilibrium, the ability to perceive certain facts as moral considerations, moral imagination, or the ability to have moral experiences with a particular phenomenological character. Robots cannot in principle possess these abilities, so robots cannot in principle replicate human moral judgment. And if robots cannot in principle replicate human moral judgment, then it is morally problematic to deploy AWS with that aim in mind. Second, we argue that even if it were possible for a sufficiently sophisticated robot to make ‘moral decisions’ that are extensionally indistinguishable from (or better than) human moral decisions, these ‘decisions’ could not be made for the right reasons. This means that the ‘moral decisions’ made by AWS are bound to be morally deficient in at least one respect even if they are extensionally indistinguishable from human ones. Our objections support the prevalent aversion to the employment of AWS in war, and they enjoy several significant advantages over the most common objections to AWS in the literature.

Author Profiles

Ryan Jenkins
California Polytechnic State University, San Luis Obispo
Duncan Purves
University of Florida
Bradley Strawser
Naval Postgraduate School

