The responsibility gap: Ascribing responsibility for the actions of learning automata

Ethics and Information Technology 6 (3):175-183 (2004)

Abstract

Traditionally, the manufacturer or operator of a machine is held morally and legally responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms, and agent architectures, create a new situation: the manufacturer or operator is in principle no longer capable of predicting the machine's future behaviour, and thus cannot be held morally responsible or liable for it. Society must choose between abandoning this kind of machine (which is not a realistic option) and facing a responsibility gap that cannot be bridged by traditional concepts of responsibility ascription.

Similar books and articles

Peter Königs - 2022 - Artificial intelligence and responsibility gaps: what is the problem? Ethics and Information Technology 24 (3):1-11.
Charles Ess - 2002 - Introduction. Ethics and Information Technology 4 (3):177-188.
Lucas D. Introna - 2001 - Editorial. Ethics and Information Technology 3 (3):155-156.
Herman T. Tavani - 2000 - Announcements. Ethics and Information Technology 2 (4):251-255.
