
RoboWarfare: can robots be more ethical than humans on the battlefield?

Published in Ethics and Information Technology

…And we are here as on a darkling plain

Swept with confused alarms of struggle and flight,

Where ignorant armies clash by night.

—Dover Beach, by Matthew Arnold

Abstract

Telerobotically operated and semiautonomous machines have become a major component in the arsenals of industrial nations around the world. By the year 2015 the United States military plans to have one-third of its combat aircraft and ground vehicles robotically controlled. Although there are many reasons for the use of robots on the battlefield, perhaps one of the most interesting assertions is that these machines, if properly designed and used, will result in a more just and ethical implementation of warfare. This paper will focus on these claims by looking at what has been discovered about the capability of humans to behave ethically on the battlefield, and then comparing those findings with the claims made by robotics researchers that their machines are able to behave more ethically on the battlefield than human soldiers. Throughout the paper we will explore the philosophical critique of this claim and also look at how the robots of today are impacting our ability to fight wars in a just manner.


Notes

  1. See Singer (2009) for a complete overview of this technology.

  2. Singer (2009, p. 125).

  3. SPAWAR is the acronym for Space and Naval Warfare Systems Center—Pacific, located in San Diego, California, USA.

  4. The Center for Ethics in Science and Technology. RoboWarfare. Is the world a better place when robots fight our wars for us? A public forum with some information archived at: http://www.ethicscenter.net/event/robowarfare-world-better-place-when-robots-fight-our-wars-us.

  5. Arkin (2007) and Lin et al. (2008).

  6. Storrs Hall (2007, pp. 339–442).

  7. Arkin (2007, p. 7).

  8. Lin et al. (2008).

  9. See Hickman (1990) for a more detailed argument.

  10. Idem, p. 111.

  11. Sullins (2009a, p. 209).

  12. Idem, p. 210.

  13. Anderson and Anderson (2004), Arkin (2007, 2009), Floridi and Sanders (2004), and Wallach and Allen (2009).

  14. Sullins (2009b & forthcoming).

  15. Walzer (1977).

  16. Idem, p. 305.

  17. Surgeon General’s Office (2006, pp. 34–43).

  18. Idem, p. 34.

  19. Ibid.

  20. Idem, p. 35.

  21. Ibid.

  22. Ibid.

  23. Idem, p. 36.

  24. Ibid.

  25. Ibid.

  26. Idem, p. 37.

  27. Idem, pp. 38–39.

  28. MacFarquhar (2009).

  29. See Singer (2009, p. 315) and Cox (2006) for a more detailed report on this phenomenon.

  30. Cox (2006).

  31. Surgeon General’s Office (2006).

  32. Pike (2009).

  33. Ibid.

  34. Ibid.

  35. For a more detailed analysis see Sullins (2009b & forthcoming).

  36. See Singer (2009) for more detailed case studies.

  37. Sullins (2009b, p. 21 & forthcoming).

  38. Sullins (2009b, p. 19).

  39. Ditz (2009a, b).

  40. Mir (2009).

  41. Saber (2009).

  42. Human Rights Watch (2009).

  43. Ibid.

  44. Silverstein (2009).

  45. Some CIA drone operators have even expressed concern themselves that they are being asked to fly missions that may seem politically expedient but are actually hurting the U.S. effort to subdue the influence of Taliban forces in Pakistan and Afghanistan, see Porter (June 3, 2010).

  46. For a more complete analysis, please see Sullins.

  47. See Dyer (2009) and Singer (2009) for evidence of this practice.

  48. Singer (2009) devotes an entire chapter to this subject (Ch. 6).

  49. Singer (2009, p. 125).

  50. Ibid.

  51. Shachtman (2007).

  52. Wallach and Allen (2009, p. 71).

  53. Ibid.

  54. Idem, p. 42.

  55. Lin et al. (2008, p. 53).

  56. Arkin (2007, 2009, p. 19).

  57. Arkin (2007, 2009, p. 20).

  58. New York Times (November 2008).

  59. Arkin (2009).

  60. Ibid.

  61. Ibid.

  62. Ibid.

  63. Idem, p. 23.

  64. Bar-Cohen and Hansen (2009, p. 153).

  65. Storrs Hall (2007, p. 344).

  66. Idem, p. 346.

  67. Kurzweil (2006, p. 389).

  68. Storrs Hall (2007, p. 355).

  69. Arkin (2007).

  70. Lin et al. (2008, p. 37).

  71. This was quoted in: Simonite (2009).

  72. Lin et al. (2008, pp. 37–38).

  73. For a more complete analysis of this process see Sullins (2007).

  74. Sparrow (2007).

  75. Idem, p. 71.

  76. Sullins (2006).

  77. Supra, p. 73.

  78. Ibid.

  79. Beavers (2009).

  80. Ibid.

  81. Mitri et al. (2009).

Abbreviations

AI:

Artificial intelligence

LOW:

Law of war

ROE:

Rules of engagement


Acknowledgments

I would like to thank all those who read early drafts of this work and provided helpful comments. I would also like to thank Noam Cook who, many years ago, drew my attention to the poem by Matthew Arnold that begins this paper and inspired my initial forays into the topic of automated weaponry.

Author information

Correspondence to John P. Sullins.

About this article

Cite this article

Sullins, J.P. RoboWarfare: can robots be more ethical than humans on the battlefield?. Ethics Inf Technol 12, 263–275 (2010). https://doi.org/10.1007/s10676-010-9241-7

