Algorithmic Microaggressions

Feminist Philosophy Quarterly 8 (3) (2022)

Abstract

We argue that machine learning algorithms can inflict microaggressions on members of marginalized groups and that recognizing these harms as instances of microaggressions is key to effectively addressing the problem. The concept of microaggression is also illuminated by being studied in algorithmic contexts. We contribute to the microaggression literature by expanding the category of environmental microaggressions and highlighting the unique issues of moral responsibility that arise when we focus on this category. We theorize two kinds of algorithmic microaggression, stereotyping and erasure microaggressions, and argue that corporations are responsible for the microaggressions their algorithms create. As a case study, we look at the problems faced by Google’s autocomplete prediction and at the insufficiency of its proposed solutions. The case study of autocomplete demonstrates our core claim that microaggressions constitute a distinct form of algorithmic bias and that identifying them as such allows us to avoid seeming solutions that recreate the same kinds of harms. Google has a responsibility to make information freely available, without exposing users to degradation. To fulfill its duties to marginalized groups, Google must abandon the fiction of neutral prediction and instead embrace the liberatory power of suggestion.


Author Profiles

Benjamin Wald
University of Toronto, St. George Campus
Emma McClure
Saint Mary's University

Citations of this work

No citations found.

