Robots, Trust and War

Philosophy & Technology · Special Issue

Abstract

Putting robots on the battlefield is clearly appealing for policymakers. Why risk human lives, when robots could take our place, and do the dirty work of killing and dying for us? Against this, I argue that robots will be unable to win the kind of wars that we are increasingly drawn into. Modern warfare tends towards asymmetric conflict. Asymmetric warfare cannot be won without gaining the trust of the civilian population; this is ‘the hearts and minds’, in the hackneyed phrase from counter-insurgency manuals. I claim that the very feature which makes it attractive to send robots to war in our place, the absence of risk, also makes it peculiarly difficult for humans to trust them. Whatever the attractions, sending robots to war in our stead will make success in counter-insurgency elusive. Moreover, there is ethical reason to be relieved at this conclusion. For if war is potentially costly, then this does much to ensure that it will be a choice only of last resort, in accordance with the traditional doctrine of jus ad bellum. In this instance, morality and expediency, fortunately, coincide.

Notes

  1. I am grateful to an anonymous reviewer who points out that such a stance may tend towards ethical nihilism. Nonetheless, whether it entails this is a more complex matter. Such reflections are certainly a source of what C. A. J. Coady aptly describes as the ‘mood’ of realism, one which is misunderstood, and misunderstands itself, as opposed to all forms of morality (Coady 2008, p. 14). He argues that realism is better understood as opposed to utopian distortions of morality, distortions which are not attentive to the facts of power and deserve the term ‘moralism’. The fact that the Just War tradition has generally recognised probability of success as a criterion of jus ad bellum admissibility indicates that it too is opposed to moralism.

  2. A similar problem has been noted regarding the perception of ethics within business cultures. ‘[M]oral terms are abandoned because they seem to lack robustness. They suggest ideals and special pleadings without too much organizational weight’ (Bird and Waters 1989, p. 78).

  3. ‘My assessment of your trustworthiness in a particular context is simply my trust of you.’ (Hardin 2002, p. 10).

  4. I am grateful to an anonymous referee for pressing this objection.

  5. I am grateful to Alex Oliver, the editors and two anonymous referees for valuable comments and criticism. This paper was presented at the 7th European Computing and Philosophy conference, Munich. The research was financially supported by Microsoft Research Cambridge.

References

  • Asimov, I. (1996 [1950]). I, Robot. London: HarperVoyager.

  • Baier, A. C. (1994). Trust and antitrust. In A. C. Baier, Moral prejudices: Essays on ethics (pp. 95–129). Cambridge: Harvard University Press.

  • Bird, F., & Waters, J. (1989). The moral muteness of managers. California Management Review, 32(1), 73–88.

  • Blackburn, S. (1998). Trust, cooperation and human psychology. In V. Braithwaite & M. Levi (Eds.), Trust and governance (pp. 28–45). New York: Russell Sage.

  • Coady, C. A. J. (2008). Messy morality: The challenge of politics. Oxford: Clarendon.

  • Crawshaw, M. (2009). The evolution of British COIN. In Security and stabilisation: The military contribution. Shrivenham: Ministry of Defence.

  • Elster, J. (1979). Ulysses and the Sirens: Studies in rationality and irrationality. Cambridge: Cambridge University Press.

  • FM 3–24 and MCWP 3–33.5. (2006). Counterinsurgency. Washington: Department of the Army and Department of the Navy.

  • FM 3–24.2. (2009). Tactics in counterinsurgency. Washington: Department of the Army.

  • Guthrie, C., & Quinlan, M. (2007). Just war: Ethics in modern warfare. London: Bloomsbury.

  • Hardin, R. (2002). Trust and trustworthiness. New York: Russell Sage.

  • Hardin, R. (2006). Trust. Cambridge: Polity Press.

  • Hieronymi, P. (2008). The reasons of trust. Australasian Journal of Philosophy, 86(2), 213–236.

  • Hollis, M. (1998). Trust within reason. Cambridge: Cambridge University Press.

  • Holton, R. (1994). Deciding to trust, coming to believe. Australasian Journal of Philosophy, 72(1), 63–76.

  • Horsburgh, H. J. N. (1960). The ethics of trust. Philosophical Quarterly, 10(41), 343–354.

  • Joint Doctrine Publication 3–40. (2009). Security and stabilisation: The military contribution. Shrivenham: Ministry of Defence.

  • Jones, K. (1996). Trust as an affective attitude. Ethics, 107(1), 4–25.

  • Krulak, C. (1999). The strategic corporal: Leadership in the three block war. http://www.au.af.mil/au/awc/awcgate/usmc/strategic_corporal.htm. Accessed 14 October 2010.

  • Lapping, B. (1989). End of empire. London: Paladin.

  • Lin, P., Abney, K., & Bekey, G. (Eds.) (2011). Robot ethics: The ethical and social implications of robotics. Cambridge, MA: MIT Press (in press).

  • Pettit, P. (1995). The cunning of trust. Philosophy & Public Affairs, 24(3), 202–225.

  • Schelling, T. C. (1960). The strategy of conflict. Cambridge: Harvard University Press.

  • Sharkey, N. (2008). Grounds for discrimination: Autonomous robot weapons. RUSI Defence Systems, 11(2), 86–89.

  • Singer, P. (2009). Wired for war: The robotics revolution and conflict in the 21st century. London: Penguin.

  • Smith, R. (2005). The utility of force: The art of war in the modern world. London: Allen Lane.

  • Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77.

  • Sullins, J. (2010). RoboWarfare: Can robots be more ethical than humans on the battlefield? Ethics and Information Technology, 12(3), 263–275.

  • Thomas, D. O. (1978). The duty to trust. Proceedings of the Aristotelian Society, n.s. 79, 89–101.

  • US Department of Defense. (2009). FY2009-2034 Unmanned Systems Integrated Roadmap. http://www.jointrobotics.com/documents/library/UMS%20Integrated%20Roadmap%202009.pdf. Accessed 14 October 2010.

  • Wiener, N. (1961). Cybernetics: Or control and communication in the animal and the machine (2nd ed.). Cambridge: MIT Press.

Author information

Corresponding author

Correspondence to Thomas W. Simpson.

Cite this article

Simpson, T.W. Robots, Trust and War. Philos. Technol. 24, 325–337 (2011). https://doi.org/10.1007/s13347-011-0030-y
