  • Robots and Respect: Assessing the Case Against Autonomous Weapon Systems. Robert Sparrow - 2016 - Ethics and International Affairs 30 (1):93-116.
    There is increasing speculation within military and policy circles that the future of armed conflict is likely to include extensive deployment of robots designed to identify targets and destroy them without the direct oversight of a human operator. My aim in this paper is twofold. First, I will argue that the ethical case for allowing autonomous targeting, at least in specific restricted domains, is stronger than critics have acknowledged. Second, I will attempt to uncover, explicate, and defend the intuition that (...)
  • Killer robots. Robert Sparrow - 2007 - Journal of Applied Philosophy 24 (1):62–77.
    The United States Army’s Future Combat Systems Project, which aims to manufacture a “robot army” to be ready for deployment by 2012, is only the latest and most dramatic example of military interest in the use of artificially intelligent systems in modern warfare. This paper considers the ethics of a decision to send artificially intelligent robots into war, by asking who we should hold responsible when an autonomous weapon system is involved in an atrocity of the sort that would normally (...)
  • Autonomous Machines, Moral Judgment, and Acting for the Right Reasons. Duncan Purves, Ryan Jenkins & Bradley J. Strawser - 2015 - Ethical Theory and Moral Practice 18 (4):851-872.
    We propose that the prevalent moral aversion to AWS is supported by a pair of compelling objections. First, we argue that even a sophisticated robot is not the kind of thing that is capable of replicating human moral judgment. This conclusion follows if human moral judgment is not codifiable, i.e., it cannot be captured by a list of rules. Moral judgment requires either the ability to engage in wide reflective equilibrium, the ability to perceive certain facts as moral considerations, moral (...)
  • Off her trolley? Frances Kamm and the metaphysics of morality. Alastair Norcross - 2008 - Utilitas 20 (1):65-80.
    Frances Kamm's aptly titled Intricate Ethics is a tour de force of what Peter Unger calls the ‘preservationist’ approach to ethical theory. Here is some of what she says about her methodology: Consider as many case-based judgments of yours as prove necessary. Do not ignore some case-based judgments, assuming they are errors, just because they conflict with simple or intuitively plausible principles that account for some subset of your case-based judgments. Work on the assumption that a different principle can account (...)
  • The responsibility gap: Ascribing responsibility for the actions of learning automata. [REVIEW] Andreas Matthias - 2004 - Ethics and Information Technology 6 (3):175-183.
    Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation, where the manufacturer/operator of the machine is in principle not capable of predicting the future machine behaviour any more, and thus cannot be held morally responsible or liable for it. The society must decide between not using this kind of machine any more (which is not a (...)
  • War and massacre. Thomas Nagel - 1972 - Philosophy and Public Affairs 1 (2):123-144.
    From the apathetic reaction to atrocities committed in Vietnam by the United States and its allies, one may conclude that moral restrictions on the conduct of war command almost as little sympathy among the general public as they do among those charged with the formation of U.S. military policy. Even when restrictions on the conduct of warfare are defended, it is usually on legal grounds alone: their moral basis is often poorly understood. I wish to argue that certain restrictions are (...)