Sharing Moral Responsibility with Robots: A Pragmatic Approach.

In Holst, Per Kreuger & Peter Funk (eds.), Frontiers in Artificial Intelligence and Applications Volume 173. IOS Press Books (2008)

Gordana Dodig Crnkovic
Chalmers University of Technology
Roboethics is a recently developed field of applied ethics which deals with the ethical aspects of technologies such as robots, ambient intelligence, direct neural interfaces, invasive nano-devices, and intelligent softbots. In this article we look specifically at the issue of (moral) responsibility in artificial intelligent systems. We argue for a pragmatic approach, in which responsibility is seen as a social regulatory mechanism. We claim that having a system which takes care of certain tasks intelligently, learning from experience and making autonomous decisions, gives us reason to talk about a system (an artifact) as being “responsible” for a task. Technology is undoubtedly morally significant for humans, so “responsibility for a task” with moral consequences can be seen as moral responsibility. Intelligent systems can be viewed as parts of socio-technological systems with distributed responsibilities, where responsible (moral) agency is a matter of degree. Since not all abnormal operating conditions of a system can be predicted, and no system can ever be tested for all possible situations of its use, the responsibility of a producer is to assure proper functioning of the system under reasonably foreseeable circumstances. Additional safety measures must nevertheless be in place to mitigate the consequences of an accident. The socio-technological system aimed at assuring the beneficial deployment of intelligent systems depends on several responsibility feedback loops that must work properly: awareness of, and procedures for handling, risks and responsibilities on the part of designers, producers, implementers and maintenance personnel, as well as an understanding in society at large of the values and dangers of intelligent technology. The basic preconditions for developing this socio-technological control system are the education of engineers in ethics and keeping alive the democratic debate on preferences about the future society.
Keywords: Roboethics, Robotic Ethics, Accountability, Responsibility


Citations of this work

On the Moral Responsibility of Military Robots. Thomas Hellström - 2013 - Ethics and Information Technology 15 (2):99-107.
Robots: Ethical by Design. Gordana Dodig Crnkovic & Baran Çürüklü - 2012 - Ethics and Information Technology 14 (1):61-71.


Similar books and articles

The Moral Responsibility of the Hospital. Richard T. De George - 1982 - Journal of Medicine and Philosophy 7 (1):87-100.
Moral Responsibility for Harm Caused by Computer System Failures. Douglas Birsch - 2004 - Ethics and Information Technology 6 (4):233-245.
Technological Delegation: Responsibility for the Unintended. Katinka Waelbers - 2009 - Science and Engineering Ethics 15 (1):51-68.
Responsibility. Garrath Williams - 2006 - Internet Encyclopedia of Philosophy.
Robotrust and Legal Responsibility. Ugo Pagallo - 2010 - Knowledge, Technology & Policy 23 (3):367-379.

