AI As a Moral Right-Holder
In Markus Dubber, Frank Pasquale & Sunit Das (eds.), The Oxford Handbook of Ethics of AI. New York: Oxford University Press (2020)
Abstract
This chapter evaluates whether AI systems are or will be rights-holders, explaining the conditions under which people should recognize AI systems as rights-holders. It develops a skeptical stance toward the idea that current forms of artificial intelligence are holders of moral rights, beginning with an articulation of one of the most prominent and most plausible theories of moral rights: the Interest Theory of rights. On the Interest Theory, AI systems will be rights-holders only if they have interests or a well-being. Current AI systems are not bearers of well-being, and so fail to meet a necessary condition for being rights-holders. This argument is robust against a range of objections. However, the chapter also shows why difficulties in assessing whether future AI systems might have interests or be bearers of well-being—and so be rights-holders—raise difficult ethical challenges for certain developments in AI.
Citations of this work
Empathic responses and moral status for social robots: an argument in favor of robot patienthood based on K. E. Løgstrup. Simon N. Balle - 2022 - AI and Society 37 (2): 535-548.
Nonhuman Value: A Survey of the Intrinsic Valuation of Natural and Artificial Nonhuman Entities. Andrea Owe, Seth D. Baum & Mark Coeckelbergh - 2022 - Science and Engineering Ethics 28 (5): 1-29.