Building a Better WarBot: Ethical Issues in the Design of Unmanned Systems for Military Applications

  • Original Paper
  • Science and Engineering Ethics

Abstract

Unmanned systems in military applications will often play a role in determining the success or failure of combat missions and thus in determining who lives and dies in times of war. Designers of UMS must therefore consider ethical, as well as operational, requirements and limits when developing UMS. I group the ethical issues involved in UMS design under two broad headings, Building Safe Systems and Designing for the Law of Armed Conflict, and identify and discuss a number of issues under each of these headings. As well as identifying issues, I offer some analysis of their implications and how they might be addressed.

Notes

  1. The research for this paper was supported by the Australian Research Council, through the award of an ARC Discovery Grant to Dr. Jessica Wolfendale, Professor Tony Coady, and Dr. Robert Sparrow. I would like to thank Neil McKinnon for assistance with locating sources for this paper and with preparing it for publication. I would also like to thank Jessica Wolfendale, Linda Barclay, Jim Sparrow, John Canning, and Ron Arkin for comments and discussion which have improved this paper.

  2. It would be preferable to maintain a terminological distinction between “uninhabited” systems, which include a remote pilot or human operator, and “unmanned” systems, which do away with human involvement altogether (Foster 2006). The vast majority of existing systems are uninhabited rather than unmanned. Describing these systems as “uninhabited” also has the advantage of avoiding any implication about the gender of the warfighters such systems displace or replace. However, the literature on robotic weapons systems, with a few exceptions, primarily refers to both these sorts of systems as unmanned and, for the sake of consistency, this is the approach I will adopt also.

  3. This paper is but a small portion of a much larger manuscript on the ethical issues raised by the development of robotic weapons systems, funded by the Australian Research Council via the ARC Discovery Grant mentioned in footnote 1 above.

  4. It may not prove a welcome recognition amongst many of the audience for whom this paper is written, but there is clearly a question about the ethics of computer scientists and engineers choosing to work on military applications at all. There are many more urgent human needs than any that will be met by military robots (Anonymous 2007). In the context of contemporary global inequality, widespread poverty, and looming environmental catastrophe, the amount of human ingenuity, labour, and physical resources devoted to the military is, frankly, morally obscene. It is also a live question whether the objective of achieving national security or a just peace could not be better served by non-military spending (Baard 2003). Researchers face a personal choice about whether they wish to participate in and contribute to such an unsatisfactory state of affairs. Whether they should speak out against their members working for the military is also an important question for professional associations of engineers, computer scientists, and roboticists (Baard 2003). However, this question arises in regard to research and development of military technologies more generally and is by no means unique to research in robotics (Mitcham 1989); consequently, I will not discuss it here.

  5. Wall (2006) cites a Major Paul Tombleson, from the British Army, testifying that the lives of British troops have already been endangered in attempts to recover crashed Pioneer UAVs.

  6. At the time of writing, this danger looms large in relation to US operations involving the use of Predator drones to attack targets inside of Pakistan. It seems highly unlikely that missile strikes alone will be capable of producing “victory” in the struggle against the Taliban (and their allies) in this region. However, the policy of carrying out such strikes involves a significant political risk of exacerbating anti-US sentiment in the region and thus increases the risk that the US will need to commit further (manned) forces to this conflict.

  7. The risk that the enemy may gain intelligence from a captured system may be reduced by building proper “anti-tamper” measures into the system. My thanks to John Canning for drawing my attention to this issue.

  8. A particularly important limitation of such systems is their inability to provide emergency medical care to wounded warfighters and/or civilians where required.

  9. See Sparrow (2007) for further discussion.

  10. A search for “Predator” AND “UAV” on YouTube on 27.6.07 produced at least four distinct pieces of footage purportedly taken by Predator UAVs.

  11. It must also be acknowledged that military socialisation prior to combat may play a large role in producing these emotions. Whether this will affect the operators of UMS depends on the type of training they receive.

  12. It is also worth observing that locating the operators of UMS in the nation in which a conflict is occurring, but confining them to a military base that is entirely isolated from the local culture, will not address the issue identified here. If security and/or other operational reasons mandate isolating military personnel from the local culture then it will make no difference whether the operators are isolated at a military base in their own country or isolated at a forward operating base.

  13. It might also go some way towards reducing the stress associated with fighting a war in a country in which one has never set foot, discussed above.

  14. This fact also has significant implications for the ethics of the return of fire from such systems when they come under attack. Klein (2003) suggests that UAVs and UCAVs need to be classified as “national assets” in order to ground a right to defend these systems if they come under enemy fire; presumably the same would be true of UGVs, UUVs, and USVs as well. While finessing the legal nomenclature may establish the legal right to fire upon enemy combatants who attack UMS, the fact that the life of the operator of UMS is not threatened when the system is attacked undercuts the moral justification for returning fire directed against them—at least insofar as this justification proceeds from the right to self-defence.

  15. Because the communication infrastructure required to keep a “human in the loop” is a weak point in UMS, systems which dispense with a human operator will be more survivable. As the tempo of battle increases as a result of technological developments, including the development of UMS, systems which rely on human input may be at a substantial disadvantage in combat against fully autonomous systems. There is therefore a substantial incentive for designers of UMS to provide systems with a capacity for autonomous operations (Adams 2001; Blackmore 2005; Excell 2007; Featherstone 2007; Lerner 2006; Szafranski 2005).

References

  • Adams, T. K. (2001). Future warfare and the decline of human decision-making. Parameters: US Army War College Quarterly (Winter, 2001–2002), 57–71.

  • Anonymous. (2007, February 7). Defense sector accused of muscling in on robots. Professional Engineering, 20(3), 8.

  • Arkin, R. C. (2007). Governing lethal behaviour: Embedding ethics in a hybrid deliberative/reactive robot architecture. Technical Report GIT-GVU-07-11 for US Army. Mobile Robot Laboratory, College of Computing, Georgia Institute of Technology. Retrieved October 25, 2007, from http://www.cc.gatech.edu/ai/robot-lab/online-publications/formalizationv35.pdf.

  • Asaro, P. M. (2007). Robots and responsibility from a legal perspective. IEEE International Conference on Robotics and Automation, Roma, Italy.

  • Baard, E. (2003). Make robots not war: Some scientists refuse to get paid for killer ideas. Village Voice (Greenwich Village, New York, N.Y.) (September), 39–44.

  • Barry, C. L., & Zimet, E. (2001). UCAVs—Technological, policy, and operational challenges. Defense Horizons (3). Retrieved November 19, 2008, from http://www.ndu.edu/inss/DefHor/DH3/HD_03.pdf.

  • Bender, B. (2005). Attacking Iraq, from a Nev. computer. Boston Globe, April 3, A6.

  • Blackmore, T. (2005). Dead slow: Unmanned aerial vehicles loitering in battlespace. Bulletin of Science, Technology & Society, 25(3), 195–214. doi:10.1177/0270467605276097.

  • Boland, R. (2007). Developing reasoning robots for today and tomorrow. Signal, 61(6), 43–46.

  • Butler, A. (2007). Global Hawk UAV supports border Ops in Iraq. Aviation Week & Space Technology, 166(11), 56.

  • Canning, J. S. (2005). A definitive work on factors impacting the arming of unmanned vehicles. Washington, DC: Dahlgren Division Naval Surface Warfare Centre.

  • Canning, J. S. (2006). A concept of operations for armed autonomous systems. In 3rd Annual Disruptive Technology Conference, September 6–7, Washington, DC.

  • Card, J. (2007). Killer machines. Foreign Policy (May/June), 92.

  • Chapman, Col. R. E. (2002). Unmanned combat aerial vehicles: Dawn of a new age? Aerospace Power Journal, 16(2), 60–73.

  • Cummings, M. L. (2004). Creating moral buffers in weapon control interface design. IEEE Technology and Society Magazine, 23(3), 28–33, 41. doi:10.1109/MTAS.2004.1337888.

  • Cummings, M. L. (2006). Automation and accountability in decision support system interface design. Journal of Technology Studies, 32(1), 23–31.

  • Dalton, J. G. (2006). Future navies—present issues. Naval War College Review, 59, 17–39.

  • Donnelly, S. B. (2005, December 4). Long-distance warriors. Time Magazine.

  • Dunlap, C. J. Jr. (1999, Autumn). Technology: Recomplicating moral life for the nation’s defenders. Parameters: US Army War College Quarterly, 24–53.

  • Excell, J. (2007, January 29). Unmanned aircraft: Out of the shadows. The Engineer, 18.

  • Featherstone, S. (2007). The coming robot army. Harper’s Magazine, 43–52.

  • Fielding, M. (2006). Robotics in future land warfare. Australian Army Journal, 3(2), 1–10.

  • Fitzsimonds, J. R., & Mahnken, T. G. (2007). Military officer attitudes toward UAV adoption: Exploring institutional impediments to innovation. Joint Force Quarterly, 46, 96–103.

  • Fulghum, D. A. (2003). Predator’s progress. Aviation Week & Space Technology, 158(9), 48–50.

  • Foster, Lt Col. J. (2006, Spring). Ricochets and replies: First rule of modern warfare. Air and Space Power Journal. Retrieved November 19, 2008, from http://www.airpower.maxwell.af.mil/airchronicles/apj/apj06/spr06/ricspr06.html.

  • Graham, S. (2006). America’s robot army. New Statesman (London, England), 135(4796), 12–15.

  • Gulam, H., & Lee, S. W. (2006). Uninhabited combat aerial vehicles and the law of armed conflict. Australian Army Journal, 3(2), 1–14.

  • Hambling, D. (2007, January 23). Military builds robotic insects. Wired Magazine. Retrieved from http://www.wired.com/science/discoveries/news/2007/01/72543.

  • Hanley, C. J. (2007, July 16). Robot-aircraft attack squadron bound for Iraq. Aviation. Retrieved November 19, 2008, from http://new.aviation.com/ap_070716_reapergoestoiraq.html.

  • Hockmuth, C. M. (2007, February). UAVs—The next generation. Air Force Magazine, 70–74.

  • Kainikara, S. (2002). UCAVs probable lynchpins of future air warfare. Asia-Pacific Defence Reporter, 28(6), 42–45.

  • Kaplan, R. D. (2006, August). Hunting the Taliban in Las Vegas. Atlantic Monthly, 4.

  • Kenyon, H. S. (2006). Israel deploys robot guardians. Signal, 60(7), 41–44.

  • Kenyon, H. S. (2007). Airborne testbed opens new possibilities. Signal, 61(9), 47–49.

  • Klein, J. J. (2003, July 22). The problematic Nexus: Where unmanned combat air vehicles and the law of armed conflict meet. Air and Space Power Journal Chronicles Online. Retrieved March 27, 2007, from http://www.airpower.maxwell.af.mil/airchronicles/cc/klein.html.

  • Kochan, A. (2005). Automation in the sky. The Industrial Robot, 32(6), 468–471. doi:10.1108/01439910510629181.

  • Lazarski, A. J. (2002). Legal implications of the uninhabited combat aerial vehicle. Air and Space Power Journal, 16(2), 74–83.

  • Legien, W., Andersson, C.-J., & Hansen, G. (2006). UUV and USV: Which ‘unmanned’ for what task? Naval Forces, 27(3), 44–51.

  • Lerner, P. (2006). Robots go to war. Popular Science, 268(1).

  • McMains, J. W. (2004). The Marine Corps robotics revolution. The Marine Corps Gazette, 88(1), 34–37.

  • Marino, D., & Tamburrini, G. (2006). Learning robots and human responsibility. International Review of Information Ethics, 6, 47–51.

  • Marks, P. (2006). Armchair warfare. New Scientist, 192(2575), 24. doi:10.1016/S0262-4079(06)60832-4.

  • Masey, J. P. (2006). Towards an unmanned navy: The growing world of UAV, UCAV, USV and UUV. Naval Forces, 27(6), 23–31.

  • Mitcham, C. (1989). The spectrum of ethical issues associated with the military support of science and technology. In C. Mitcham & P. Siekevitz (Eds.), Ethical issues associated with scientific and technological research for the military, Vol. 577 of Annals of the New York Academy of Sciences.

  • Mustin, J. (2002). Future employment of unmanned aerial vehicles. Air and Space Power Journal, 16(2), 86–97.

  • Office of the Secretary of Defense. (2005a). Joint Robotics Program Master Plan FY2005: Out front in harm’s way. Washington DC: Office of the Undersecretary of Defense (AT&L) Defense Systems/Land Warfare and Munitions.

  • Office of the Secretary of Defense. (2005b). Unmanned Aircraft Systems Roadmap: 2005–2030. Washington DC: Department of Defense, United States Government.

  • Office of the Under Secretary of Defense. (2006). Development and utilisation of robotics and unmanned ground vehicles: Report to congress. Washington DC: Office of the Under Secretary of Defense, Acquisition, Technology and Logistics, Portfolio Systems Acquisition, Land Warfare and Munitions, Joint Ground Robotics Enterprise.

  • Peterson, G. I. (2005). Unmanned vehicles: Changing the way to look at the battlespace. Naval Forces, 26(4), 29–38.

  • Royal Australian Air Force. (2004). AAP 1003—Operations Law for RAAF Commanders (2nd ed.). Canberra: Air Power Development Centre.

  • Scarborough, R. (2005, May 8). Special Report: Unmanned Warfare. Washington Times.

  • Schmitt, M. N. (2005). Precision attack and international humanitarian law. International Review of the Red Cross, 87(859), 445–466.

  • Shachtman, N. (2005a). Attack of the drones. Wired Magazine, 13(6). Retrieved August 25, 2005 from http://www.wired.com/wired/archive//13.06/drones_pr.html.

  • Shachtman, N. (2005b, May 27). Drone school, a ground’s-eye view. Wired Magazine. Retrieved September 17, 2007, from http://www.wired.com/science/discoveries/news/2005/05/67655.

  • Sherman, J. (2005). The drone wars. The Bulletin of the Atomic Scientists, 61(5), 28–37. doi:10.2968/061005011.

  • Shurtleff, D. K. (2002). The effects of technology on our humanity. Parameters: US Army War College Quarterly (Summer), 100–112.

  • Sparrow, R. (2007). Killer robots. Journal of Applied Philosophy, 24(1), 62–77. doi:10.1111/j.1468-5930.2007.00346.x.

  • Sullivan, J. M. (2006). Evolution or revolution? The rise of UAVs. IEEE Technology and Society Magazine, 25(3), 43–49. doi:10.1109/MTAS.2006.1700021.

  • Szafranski, Col. R. (2005). The first rule of modern warfare: Never bring a knife to a gunfight. Air and Space Power Journal, 19(4), 19–26.

  • Thorton, Captain R. L., Jr. (2005). The case for robots in the SBCTs (now). Infantry, 94(1), 33–41.

  • Ulin, D. L. (2005, January 30). When robots do the killing. Los Angeles Times.

  • Veruggio, G. (2006). EURON roboethics roadmap. Genoa: European Robotics Research Network.

  • Veruggio, G., & Operto, F. (2006). Roboethics: Social and ethical implications of robotics. Springer Handbook of Robotics (pp. 1499–1524). Berlin: Springer.

  • Wall, R. (2006). Reality check. Aviation Week & Space Technology, 165(5), 33.

  • Wise, J. (2007). No pilot, no problem. Popular Mechanics, 184(4), 64.

Author information

Correspondence to Robert Sparrow.

About this article

Cite this article

Sparrow, R. Building a Better WarBot: Ethical Issues in the Design of Unmanned Systems for Military Applications. Sci Eng Ethics 15, 169–187 (2009). https://doi.org/10.1007/s11948-008-9107-0
