Intending to err: the ethical challenge of lethal, autonomous systems [Book Review]

Ethics and Information Technology 14 (4):241-254 (2012)
Abstract: This article has no associated abstract.
Keywords: Biometrics, Lethal autonomy, Right intention, Tolerance for error

References found in this work
John P. Sullins (2006). When Is a Robot a Moral Agent? International Review of Information Ethics 6 (12):23-30.

Citations of this work

No citations found.

Similar books and articles
Randolph Clarke (2008). Autonomous Reasons for Intending. Australasian Journal of Philosophy 86 (2):191-212.
Lubomira Radoilska (2013). Autonomy and Depression. In K. W. M. Fulford, Martin Davis, George Graham, John Sadler, Giovanni Stanghellini & Tim Thornton (eds.), Oxford Handbook of Philosophy and Psychiatry. Oxford University Press. 1155-1170.
Wayne A. Davis (1984). A Causal Theory of Intending. American Philosophical Quarterly 21 (1):43-54.
Ric Caric Northrup (1994). Identity, Social Relations, and Time. Philosophy in the Contemporary World 1 (1):26-33.
Analytics
Added to index: 2012-10-07
Total downloads: 1 (#432,867 of 1,098,427)
Recent downloads (6 months): 1 (#285,057 of 1,098,427)
