Ethics and Information Technology 12 (3):235-241 (2010)

Authors
Mark Coeckelbergh
University of Vienna
Abstract
Can we build ‘moral robots’? If morality depends on emotions, the answer seems negative. Current robots do not meet standard necessary conditions for having emotions: they lack consciousness, mental states, and feelings. Moreover, it is not even clear how we might ever establish whether robots satisfy these conditions. Thus, at most, robots could be programmed to follow rules, but it would seem that such ‘psychopathic’ robots would be dangerous since they would lack full moral agency. However, I will argue that in the future we might nevertheless be able to build quasi-moral robots that can learn to create the appearance of emotions and the appearance of being fully moral. I will also argue that this way of drawing robots into our social-moral world is less problematic than it might first seem, since human morality also relies on such appearances.
Keywords: Appearance · Emotions · Feelings · Human morality · Mental states · Robot morality · Rule-following
DOI 10.1007/s10676-010-9221-y
