
AI, Ethics, and Design: Revisiting the Trolley Problem

Abstract

This chapter critiques the trolley problem, a well-known ethics thought experiment that highlights the limitations of utilitarian ethics, and its frequent application in discussions of autonomous vehicle safety. It introduces alternative approaches, including “moral crumple zones” (Elish and Hwang), “wicked problems” (Rittel and Webber), and gradations of system control (Parasuraman, Sheridan, and Wickens), and considers the ethical issues raised by Department of Defense funding in tech companies. It suggests that approaches from design might better capture the dynamics at play in computational technologies and frame their ethical implications.


Notes

  1. Foot, “The Problem of Abortion and the Doctrine of Double Effect,” 1. Page numbers refer to the republished PDF: https://philpapers.org/archive/FOOTPO-2.pdf

  2. Foot, “The Problem of Abortion and the Doctrine of Double Effect,” 4.

  3. Ibid., 5.

  4. Thomson, “The Trolley Problem,” 1404.

  5. Ibid., 1409.

  6. Ibid.

  7. Velasquez, Andre, Shanks, and Meyer, “Thinking Ethically,” n.d., Markkula Center for Applied Ethics, Santa Clara University, https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/thinking-ethically/

  8. National Transportation Safety Board, “Preliminary Report Highway HWY18MH010,” n.d., 1, https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf

  9. Ibid., 2.

  10. Ibid., 3.

  11. Ibid., 3.

  12. Tempe Police, “Tempe Police Vehicular Crimes Unit Is Actively Investigating the Details of This Incident That Occurred on March 18th. We Will Provide Updated Information Regarding the Investigation Once It Is Available. Twitter.Com/2dVP72TziQ,” Tweet, @TempePolice, March 21, 2018, https://twitter.com/TempePolice/status/976585098542833664

  13. “Backup Driver in Fatal Self-Driving Uber Crash Was Streaming Hulu,” The Washington Post, accessed January 26, 2019, https://www.washingtonpost.com/news/dr-gridlock/wp/2018/06/22/uber-safety-drivers-phone-was-streaming-the-voice-ahead-of-deadly-driverless-crash-police-find/?utm_term=.1a8551e68ae6

  14. Zaveri, “Prosecutors Don’t Plan to Charge Uber in Self-Driving Car’s Fatal Accident,” New York Times, March 5, 2019, https://www.nytimes.com/2019/03/05/technology/uber-self-driving-car-arizona.html, and Randazzo, “Uber reaches settlement with family of woman killed by self-driving car,” The Republic/azcentral.com, March 29, 2018, https://www.azcentral.com/story/news/local/tempe/2018/03/29/uber-settlement-self-driving-car-death-arizona/469278002/

  15. Elish, “Moral Crumple Zones.”

  16. Ibid., 2.

  17. Ibid., 3.

  18. Ibid., 16.

  19. Ibid., 13–14, 19.

  20. Ibid., 3.

  21. JafariNaimi, “Our Bodies in the Trolley’s Path, or Why Self-Driving Cars Must *Not* Be Programmed to Kill.”

  22. Uber Advanced Technologies Group, Uber Self-Driving Cars Return to Pittsburgh Roads in Self-Driving, accessed January 26, 2019, https://www.youtube.com/watch?v=0E5IQJj_oKY; “Uber Advanced Technologies Group,” Uber Advanced Technologies Group, accessed January 26, 2019, https://www.uber.com/info/atg/

  23. In July 2018, Uber laid off 100 workers in the wake of the crash, announcing that they could either take severance or apply for 55 new positions as mission specialists. The workers had been on leave since the crash in March 2018. The cut was another in a series of layoffs from an original pool of 400 autonomous vehicle operators in five US cities. Laura Bliss, “Uber Just Dramatically Scaled Back Autonomous Car Testing,” CityLab, accessed January 26, 2019, https://www.citylab.com/transportation/2018/07/uber-just-fired-its-pittsburgh-av-drivers/564947/

  24. Uber Advanced Technologies Group, Uber Self-Driving Cars Return to Pittsburgh Roads in Self-Driving.

  25. Ibid.

  26. Parasuraman, Sheridan, and Wickens, “A Model for Types and Levels of Human Interaction with Automation.”

  27. Ibid., 292.

  28. Ibid.

  29. Ibid.

  30. Rittel and Webber, “Dilemmas in a General Theory of Planning,” 161.

  31. Ibid., 162–63.

  32. Ibid., 162, 163.

  33. Ibid., 165.

  34. For example, see Markoff, “Robot Cars Cannot Count on Us in an Emergency,” The New York Times, December 22, 2017, https://www.nytimes.com/2017/06/07/technology/google-self-driving-cars-handoff-problem.html. See also “Top Robotics Expert on Uber Crash Questions Whether Sensors Worked,” accessed January 26, 2019, https://www.usatoday.com/story/tech/2018/03/23/top-robotics-expert-uber-crash-questions-whether-sensors-worked/451420002/

  35. Edwards, The Closed World, 64. The agency was known as ARPA from 1958–1972 and 1993–1996. The 1993 name change to ARPA was supported by an interest in commercial development over military development, backed by President Bill Clinton and then Deputy Defense Secretary William Perry. “DARPA Sixty Years,” accessed January 26, 2019, https://www.darpa.mil/Timeline/index.html

  36. “Project Maven Industry Day Pursues Artificial Intelligence for DoD Challenges,” US Department of Defense, accessed January 26, 2019, https://dod.defense.gov/News/Article/Article/1356172/project-maven-industry-day-pursues-artificial-intelligence-for-dod-challenges/

  37. Vincent, “Google Is Using Its AI Skills to Help the Pentagon Learn to Analyze Drone Footage,” The Verge, March 6, 2018, https://www.theverge.com/2018/3/6/17086276/google-ai-military-drone-analysis-pentagon-project-maven-tensorfow

  38. Image published in Tom Simonite, “Google Sets Limits on Its Use of AI, but Allows Defense Work,” Wired, June 7, 2018, https://www.wired.com/story/google-sets-limits-on-its-use-of-ai-but-allows-defense-work/

  39. “‘The Business of War’: Google Employees Protest Work for the Pentagon,” accessed January 26, 2019, https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html. Letter published at https://static01.nyt.com/files/2018/technology/googleletter.pdf

  40. “Don’t be Evil” was in the prospectus of Google’s initial public offering in 2004, which states, “Don’t be evil. We believe strongly that in the long-term, we will be better served—as shareholders and in all other ways—by a company that does good things for the world even if we forgo some short-term gains. This is an important aspect of our culture and is broadly shared within the company. Google users trust our systems to help them with important decisions: medical, financial, and many others. Our search results are the best we know how to produce. They are unbiased and objective and we do not accept payment for them or for inclusion or more frequent updating. We also display advertising, which we work hard to make relevant, and we label it clearly. This is similar to a well-run newspaper, where the advertisements are clear and the articles are not influenced by the advertisers’ payments. We believe it is important for everyone to have access to the best information and research, not only to the information people pay for you to see.” “Amendment No. 9 to Form S-1,” accessed January 26, 2019, https://www.sec.gov/Archives/edgar/data/1288776/000119312504142742/ds1a.htm#toc59330_1

  41. “Open Letter in Support of Google Employees and Tech Workers,” ICRAC (blog), May 1, 2018, https://www.icrac.net/open-letter-in-support-of-google-employees-and-tech-workers/, cited in “The Line Between Big Tech and Defense Work,” Wired, accessed January 26, 2019, https://www.wired.com/story/the-line-between-big-tech-and-defense-work/

  42. “Google’s Decision to Ditch Project Maven Is a Grave Error,” Bloomberg, accessed January 26, 2019, https://www.bloomberg.com/opinion/articles/2018-06-06/google-s-decision-to-ditch-project-maven-is-a-grave-error

  43. Conger, “Google Removes ‘Don’t Be Evil’ Clause From Its Code of Conduct,” Gizmodo, accessed January 26, 2019, https://gizmodo.com/google-removes-nearly-all-mentions-of-dont-be-evil-from-1826153393

  44. Employees of Microsoft, “An Open Letter to Microsoft: Don’t Bid on the US Military’s Project JEDI,” Medium, October 12, 2018, https://medium.com/s/story/an-open-letter-to-microsoft-dont-bid-on-the-us-military-s-project-jedi-7279338b7132

  45. “Technology and the US Military,” Microsoft on the Issues, October 26, 2018, https://blogs.microsoft.com/on-the-issues/2018/10/26/technology-and-the-us-military/

  46. Conger, Sanger, and Shane, “Microsoft Wins Pentagon’s $10 Billion JEDI Contract, Thwarting Amazon,” New York Times, October 25, 2019, accessed October 28, 2019, https://www.nytimes.com/2019/10/25/technology/dod-jedi-contract.html

  47. “Announcing the New AWS Secret Region,” Amazon Web Services, November 20, 2017, https://aws.amazon.com/blogs/publicsector/announcing-the-new-aws-secret-region/

  48. “Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots,” American Civil Liberties Union, accessed January 26, 2019, https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28

  49. Ibid.

  50. An Amazon Employee, “I’m an Amazon Employee. My Company Shouldn’t Sell Facial Recognition Tech to Police.,” Medium, October 16, 2018, https://medium.com/s/powertrip/im-an-amazon-employee-my-company-shouldn-t-sell-facial-recognition-tech-to-police-36b5fde934ac, and Trevor Timm, “‘Shock, Anger, Disappointment’: An Amazon Employee Speaks Out,” Medium, October 16, 2018, https://medium.com/s/oversight/shock-anger-disappointment-an-amazon-employee-speaks-out-88d927792950

  51. Castellanos, “Pentagon Signs $885 Million Artificial Intelligence Contract with Booz Allen,” WSJ (blog), July 30, 2018, https://blogs.wsj.com/cio/2018/07/30/pentagon-signs-885-million-artificial-intelligence-contract-with-booz-allen/

  52. Baxter compiled a list of more than 20 separate tools and frameworks in October 2018. She noted in a personal communication that another ethics and AI researcher told her there were now more than 200. Kathy Baxter, personal communication, October 12, 2018.

  53. “AI at Google: Our Principles,” Google, June 7, 2018, https://www.blog.google/technology/ai/ai-principles/

  54. Statt, “Google Pledges Not to Develop AI Weapons, but Says It Will Still Work with the Military,” The Verge, June 7, 2018, https://www.theverge.com/2018/6/7/17439310/google-ai-ethics-principles-warfare-weapons-military-project-maven

  55. Suchman, “Corporate Accountability,” Robot Futures (blog), June 11, 2018, https://robotfutures.wordpress.com/2018/06/10/corporate-accountability/

  56. Wagner, “Ethics as an Escape from Regulation: From Ethics-Washing to Ethics-Shopping?”

  57. Clark, “Chief Ethics Officers: Who Needs Them?,” Forbes, October 23, 2006, https://www.forbes.com/2006/10/23/leadership-ethics-hp-lead-govern-cx_hc_1023ethics.html#49bc46a25182. Thank you to Kathy Baxter for this insight, originally encountered in Kathy Baxter, “How to Build Ethics into AI — Part I,” Salesforce UX (blog), March 27, 2018, https://medium.com/salesforce-ux/how-to-build-ethics-into-ai-part-i-bf35494cce9

  58. Ibid., 5–6.

  59. Friedman, Kahn, Jr., and Borning, “Value Sensitive Design and Information Systems,” in Human-Computer Interaction in Management Information Systems: Foundations, 1. See also Friedman and Hendry, Value Sensitive Design.

  60. Friedman, Kahn, Jr., and Borning, 2.

  61. Ibid., 3–4.

  62. Guszcza, “Why Artificial Intelligence Needs Human-Centric Design,” Deloitte Insights, January 22, 2018, https://www2.deloitte.com/insights/us/en/deloitte-review/issue-22/artificial-intelligence-human-centric-design.html

  63. Ibid.

  64. “Trolley Problem Memes – Posts,” accessed January 26, 2019, https://www.facebook.com/TrolleyProblemMemes/photos/a.250373635311569/670447403304188/?type=3&theater

  65. Masicampo, A Two-Year-Old’s Solution to the Trolley Problem, accessed October 28, 2019, https://www.youtube.com/watch?v=-N_RZJUAQY4

  66. Roose, “The Hidden Automation Agenda of the Davos Elite,” The New York Times, January 26, 2019, sec. Technology, https://www.nytimes.com/2019/01/25/technology/automation-davos-world-economic-forum.html



Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Steenson, M.W. (2021). AI, Ethics, and Design: Revisiting the Trolley Problem. In: Ward, S.J.A. (eds) Handbook of Global Media Ethics. Springer, Cham. https://doi.org/10.1007/978-3-319-32103-5_26
