
Workplace automation and political replacement: a valid analogy?

  • Original Research
  • Published in AI and Ethics

Abstract

A great deal of theorizing has emerged about the economic ramifications of increased automation. However, significantly less attention has been paid to the potential effects of AI-driven occupational replacement on less measurable metrics—in particular, what it feels like to be replaced. In politics, we see examples of nation-states and extremist groups invoking the concept of replacement as a motivator for political action, unrest, and, at times, violence. In the realm of workplace automation, and in particular in the case of AI-driven workplace automation, the replacement of human labor with artificial labor is an explicit goal. In this paper, we suggest that, given the effects a sense of replacement has in political contexts and its potential to motivate unrest and violence, we should be concerned about the widely predicted replacement of workers over the coming decades for reasons beyond the potential economic challenges that may arise.

Notes

  1. Renaud Camus is a French writer, political activist, and nationalist. In the 1970s and 80s, he was a member of France’s Socialist party and wrote a few successful novels which were influential in LGBTQ communities. By the late 90s and early 2000s, Camus had become a radical nationalist and began expounding his Great Replacement conspiracy theory.

  2. For a detailed discussion of these components, see Eisikovits, forthcoming.

  3. Note that we are using the term “white nationalist” here. The “white supremacist” position would add a claim of the inherent inferiority of non-white ethnicities and the consequent superiority of their own ethnicity. While in practice these ideologies tend to be advocated in tandem, we are focusing here on the former.

  4. The literature on workers’ attitudes towards workplace automation includes texts such as Ivanov et al. [20], Nam [28], and Brougham and Haar [5].

  5. Granulo et al. [17] is an exception to this general dearth of interest in exploring the internal experiences of workers replaced by automation.

  6. Many commentators on the ethical risks of AI systems worry about algorithmic bias—the ways in which algorithms perpetuate and reify existing social biases. See for example O’Neil [29], Danks and London [8], and Eubanks [13]. While these concerns are certainly merited and important, algorithmic bias can, in principle, be addressed if the political will and commercial motivation exist to diversify the coders and quality control the models they create. And, in fact, other commentators argue that machines have the potential to operate with greater equity than human judgment can. See for example Eisikovits and Feldman [10].

  7. Consider, for example, the loss of socialization opportunities or the stresses that accompany the loss of income. Additionally, these factors are further complicated by considerations such as gender role, familial role, age, social support, and more [2, 36].

  8. However, even if the above hypothesis is true, it is still possible that sufficiently wide-scale automation of low-skill, low-wage positions may lead to social unrest if automation quickly and significantly reduces their availability. Having a job, providing for oneself or one’s family, being useful—at a more general level, these elements contribute to a sense of self-worth, and their absence would reasonably contribute to social unrest and backlash.

  9. It is beyond the scope of the current discussion to build out how such a shift (from a loosely organized collection of individuals to an identity group) occurs, both in theory and in practice. Social identity theory may be a fruitful means of framing this shift from occupational replacement to political replacement by examining how individuals self-categorize into social groups.

  10. For an important recent example of this approach, see Case and Deaton [6].

  11. The concept and literature of dehumanization may be another interesting way to frame this feeling of ‘replaceability’. However, much of the literature on dehumanization approaches the concept from an objective, normative position, i.e., it seeks to tell us under what conditions we ought to consider someone to have been dehumanized and, by extension, what dehumanization consists of [18, 31]. The question of what dehumanization feels like from a first-person perspective, and what reactions it might tend to produce, is less thoroughly explored and warrants further investigation.

References

  1. Agrawal, A., Gans, J., Goldfarb, A.: The economics of artificial intelligence: an agenda. NBER Reporter (2019)

  2. Alvaro, L., Garrido, A., Pereira, C.R., Torres, A.R., Barros, S.C.: Unemployment, self-esteem, and depression: differences between men and women. Span. J. Psychol. (2019). https://doi.org/10.1017/sjp.2018.68

  3. Anderson, E.C.: China restored: the middle kingdom looks to 2020 and beyond. Praeger, Santa Barbara (2010)

  4. Lo, B.: Russia and the new world disorder. Brookings Institution Press with Chatham House, Washington DC (2015). https://doi.org/10.7864/j.ctt6wpccc

  5. Brougham, D., Haar, J.: Smart technology, artificial intelligence, robotics, and algorithms (STARA): employees’ perceptions of our future workplace. J. Manag. Organ. (2018). https://doi.org/10.1017/jmo.2016.55

  6. Case, A., Deaton, A.: Deaths of despair and the future of capitalism. Princeton University Press, Princeton (2020)

  7. Chui, M., Manyika, J., Miremadi, M.: Where machines could replace humans—and where they can’t (yet). McKinsey Global Institute. http://dln.jaipuria.ac.in:8080/jspui/bitstream/123456789/2951/1/Where-machines-could-replace-humans-and-where-they-cant-yet.pdf (2016). Accessed 19 Nov 2022

  8. Danks, D., London, A.: Algorithmic bias in autonomous systems. Proc. Twenty-Sixth Int. Jt. Conf. Artif. Intell. (2017). https://doi.org/10.24963/ijcai.2017/654

  9. Dirican, C.: The impacts of robotics, artificial intelligence on business and economics. Proced. Soc. Behav. Sci. (2015). https://doi.org/10.1016/j.sbspro.2015.06.134

  10. Eisikovits, N., Feldman, D.: AI and phronesis. Moral Philos. Polit. (2021). https://doi.org/10.1515/mopp-2021-0026

  11. Eisikovits, N.: ISIS, humiliation and political philosophy. J. Mil. Ethics (2022)

  12. Eisikovits, N.: Political humiliation and the sense of replacement. In: Parsons, W. (ed.) How to end a war. Cambridge University Press, Cambridge (2022)

  13. Eubanks, V.: Automating inequality: how high-tech tools profile, police, and punish the poor. St. Martin’s Press, New York (2018)

  14. Gerges, F.: ISIS: a history. Princeton University Press, Princeton (2016)

  15. Geiger, A.W.: How Americans see automation and the workplace in 7 charts. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/08/how-americans-see-automation-and-the-workplace-in-7-charts/ (2019). Accessed 20 Nov 2022

  16. Grace, K., Salvatier, J., Dafoe, A., Zhang, B., Evans, O.: Viewpoint: when will AI exceed human performance? Evidence from AI experts. J. Artif. Intell. Res. (2018). https://doi.org/10.1613/jair.1.11222

  17. Granulo, A., Fuchs, C., Puntoni, S.: Psychological reactions to human versus robotic job replacement. Nat. Hum. Behav. (2019). https://doi.org/10.1038/s41562-019-0670-

  18. Haslam, N.: Dehumanization: an integrative review. Pers. Soc. Psychol. Rev. (2006). https://doi.org/10.1207/s15327957pspr1003_4

  19. Hill, F.: There is nothing for you here: finding opportunity in the twenty-first century. Mariner Books, Boston (2021)

  20. Ivanov, S., Kuyumdzhiev, M., Webster, C.: Automation fears: drivers and solutions. Technol. Soc. (2020). https://doi.org/10.1016/j.techsoc.2020.101431

  21. Keefe, T.: The stresses of unemployment. Soc. Work (1984). https://doi.org/10.1093/sw/29.3.264

  22. Kristof, N.D., WuDunn, S.: Tightrope: Americans reaching for hope. Alfred A. Knopf, New York (2020)

  23. Lago, C.: Covid-19 has exacerbated automation anxiety but fear of machines is nothing new. Tech Monitor. https://techmonitor.ai/technology/ai-and-automation/covid-19-and-automation-anxiety (2021). Accessed 20 Nov 2022

  24. Manyika, J., Lund, S., Chui, M., Bughin, J., Woetzel, J., Batra, P., Ko, R., Sanghvi, S.: Jobs lost, jobs gained: workforce transitions in a time of automation. McKinsey Global Institute, Washington, DC (2017)

  25. Martela, G.M., Unanue, W., Araya, S., Bravo, D., Espejo, A.: What makes work meaningful? Longitudinal evidence for the importance of autonomy and beneficence for meaningful work. J. Vocat. Behav. (2021). https://doi.org/10.1016/j.jvb.2021.103631

  26. McKee, R., Song, Z., Wanberg, C.R., Kinicki, A.J.: Psychological and physical well-being during unemployment. J. Appl. Psychol. (2005). https://doi.org/10.1037/0021-9010.90.1.53

  27. Miscenko, D., Day, D.V.: Identity and identification at work. Organ. Psychol. Rev. (2016). https://doi.org/10.1177/2041386615584009

  28. Nam, T.: Citizen attitudes about job replacement by robotic automation. Futures (2019). https://doi.org/10.1016/j.futures.2019.04.005

  29. O’Neil, C.: Weapons of math destruction. The Crown Publishing Group, New York (2016)

  30. Petriglieri, J.L.: Under threat: responses to and the consequences of threats to individuals’ identities. Acad. Manag. Rev. (2011). https://doi.org/10.5465/AMR.2011.65554645

  31. Smith, D.L.: On inhumanity: dehumanization and how to resist it. Oxford University Press, New York (2020)

  32. Szczepanski, M.: Economic impacts of artificial intelligence (AI). EPRS: European Parliamentary Research Service. https://policycommons.net/artifacts/1334867/economic-impacts-of-artificial-intelligence-ai/1940719/ (2019). Accessed 18 Nov 2022

  33. Vance, J.D.: Hillbilly elegy: a memoir of a family and culture in crisis. Harper, New York (2018)

  34. Wakabayashi, D.: Meet the people who train the robots (to do their own jobs). The New York Times. https://www.nytimes.com/2017/04/28/technology/meet-the-people-who-train-the-robots-to-do-their-own-jobs.html (2017). Accessed 1 July 2022

  35. Walker, S.: The long hangover: Putin’s new Russia and the ghosts of the past. Oxford University Press, New York (2018)

  36. Walley, C.: Robots as symbols and anxiety over work loss. MIT Work of the Future. https://workofthefuture.mit.edu/research-post/robots-as-symbols-and-anxiety-over-work-loss/ (2020). Accessed 17 Jan 2022

  37. Wildman, S.: “You will not replace us”: a French philosopher explains the Charlottesville chant. Vox. https://www.vox.com/world/2017/8/15/16141456/renaud-camus-the-great-replacement-you-will-not-replace-us-charlottesville-white (2017). Accessed 3 Aug 2020

  38. Williams, T.: The French origins of “You Will Not Replace Us”. The New Yorker. https://www.newyorker.com/magazine/2017/12/04/the-french-origins-of-you-will-not-replace-us (2017). Accessed 3 Aug 2020

  39. Waters, L.E., Moore, K.A.: Predicting self-esteem during unemployment: the effect of gender, financial deprivation, alternate roles, and social support. J. Employ. Couns. (2002). https://doi.org/10.1002/j.2161-1920.2002.tb00848.x

  40. Zheng, W.: Never forget national humiliation: historical memory in Chinese politics and foreign relations. Columbia University Press, New York (2012)

  41. Zicheng, Y., Liu, G., Levine, S.I.: Inside China’s Grand Strategy: The perspective from the People’s Republic. The University Press of Kentucky, Lexington (2010)

  42. Zuelke, L.T., Schroeter, M.L., Witte, A.V., Hinz, A., Engel, C., Enzenbach, C., Zachariae, S., Loeffler, M., Thiery, J., Villringer, A., Riedel-Heller, S.G.: The association between unemployment and depression: results from the population-based LIFE-Adult-Study. J. Affect. Disord. (2018). https://doi.org/10.1016/j.jad.2018.04.073

Author information

Corresponding author

Correspondence to Jake Burley.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Burley, J., Eisikovits, N. Workplace automation and political replacement: a valid analogy?. AI Ethics 3, 1361–1370 (2023). https://doi.org/10.1007/s43681-022-00245-6
