Does AI Debias Recruitment? Race, Gender, and AI’s “Eradication of Difference”

Philosophy and Technology 35 (4):1-25 (2022)

Abstract

In this paper, we analyze two key claims offered by recruitment AI companies in relation to the development and deployment of AI-powered HR tools: (1) recruitment AI can objectively assess candidates by removing gender and race from their systems, and (2) this removal of gender and race will make recruitment fairer, help customers attain their DEI goals, and lay the foundations for a truly meritocratic culture to thrive within an organization. We argue that these claims are misleading for four reasons: First, attempts to “strip” gender and race from AI systems often misunderstand what gender and race are, casting them as isolatable attributes rather than broader systems of power. Second, the attempted outsourcing of “diversity work” to AI-powered hiring tools may unintentionally entrench cultures of inequality and discrimination by failing to address the systemic problems within organizations. Third, AI hiring tools’ supposedly neutral assessment of candidates’ traits belies the power relationship between the observer and the observed. Specifically, the racialized history of character analysis and its associated processes of classification and categorization play into longer histories of taxonomical sorting and reflect the current demands and desires of the job market, even when not explicitly conducted along the lines of gender and race. Fourth, recruitment AI tools help produce the “ideal candidate” they supposedly identify, by constructing associations between words and people’s bodies. From these four conclusions, we offer three key recommendations to AI HR firms, their customers, and policymakers going forward.


Links

PhilArchive



Similar books and articles

The Whiteness of AI. Stephen Cave & Kanta Dihal - 2020 - Philosophy and Technology 33 (4):685-703.
More than Skin Deep: A Response to “The Whiteness of AI”. Shelley Park - 2021 - Philosophy and Technology 34 (4):1961-1966.

Analytics

Added to PP: 2022-10-11
Downloads: 102 (#213,682)
Last 6 months: 14 (#181,413)

