
Data orientalism: on the algorithmic construction of the non-Western other


Abstract

Research on algorithms tends to focus on American companies and on the effects their algorithms have on Western users, while such algorithms are in fact developed in various geographical locations and used in highly diverse socio-cultural contexts. That is, the spatial trajectories through which algorithms operate and the distances and differences between the people who develop such algorithms and the users their algorithms affect remain overlooked. Moreover, while the power of big data algorithms has recently been compared to colonialism (Couldry and Mejias 2019), the move from the colonial gaze (Yegenoglu 1998) to the algorithmic gaze (Graham 2010) has yet to be fully discussed. This article aims to fill these gaps by exploring attempts to algorithmically conceptualize “the Other.” Based on the case study of an Israeli user-profiling company and its attempts to sell its services to East Asian corporations, I show that the algorithmic gaze—algorithms’ ability to characterize, conceptualize, and affect users—stems from a complex combination of opposing-but-complementary perspectives: it is simultaneously a continuation of the colonial gaze and its complete opposite. The ways in which algorithms are programmed to see the Other, the ways algorithmic categories are named to depict the Other, and the ways people who design such algorithms describe and understand the Other are all different but deeply interrelated factors in how algorithms “see.” I accordingly argue that the story of algorithms is an intercultural one, and that the power of algorithms perpetually flows back and forth—between East and West, South and North.


Notes

  1. Beer recently described the “data gaze” as the ways data analytics companies see and talk about data (Beer 2018). The “algorithmic gaze” I discuss here refers to the ways in which such companies design, construct, and tweak their algorithms to better “see,” conceptualize, and influence people.

  2. As Ian Bogost suggested, algorithms’ power is not a material phenomenon so much as it is a devotional one—algorithms are idolized and are increasingly imagined as transcendental ideals (Bogost 2015). As Gillespie famously wrote: “That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God” (Gillespie 2014). Cooper similarly argued that, like pastoral power, algorithmic governmentality shepherds our attention and conduct via anticipatory tactics (Cooper 2020).

  3. This research was approved by the ethics committee of the Faculty of Social Sciences, The Hebrew University of Jerusalem.

  4. I used pseudonyms for all individuals and companies mentioned in this article.

  5. It is important to note that this case study is not designed to be representative of Israelis’ views on culture or race, nor of the views of data analytics companies in general on these issues. Keeping in mind that the strengths of qualitative, critical research come from understanding the “how” and “why” of socio-cultural phenomena, not their frequency of incidence (Small 2008), this article offers a description of the socio-technical mechanisms behind the expansion of algorithmic powers, not an assessment of the ethics (or racism) of such companies in general.

  6. The last decade has seen the creation of hundreds of Israeli data analytics companies (IVC 2020), and of thousands of such companies worldwide. The algorithmic products these companies produce are ubiquitous, as they operate behind the scenes of almost any online service—profiling users, personalizing content, and nudging users towards different choices. Such companies often work with and rely upon much bigger global companies; specifically, as shown below, they often rely on Facebook data, accessing it through Facebook’s API (see Kotliar 2020a).

  7. All Hebrew excerpts were translated by the author.

  8. Anthropologist Lila Abu-Lughod famously proposed reconsidering the concept of culture and devised a strategy for “writing against culture”—a way of generating knowledge about people without over-generalizations and with a constant focus on the particular, the local, and the contextual (Abu-Lughod 2006). While there are many epistemological, methodological, and ethical differences between this view and the algorithmic view of people, both seem to see Culture as a problematic category that should somehow be evaded. After all, while Extractive’s algorithms purport to move beyond categories like “Russian” or “Chinese,” Abu-Lughod criticizes the characterization of people as “the Nuer” or “the Balinese” (ibid. p. 475). Moreover, much like Abu-Lughod’s methodology, algorithms offer an allegedly more fine-grained, practice-based view of people that purports to replace cruder, top-down perspectives. As demonstrated below, the algorithmic gaze is simultaneously reminiscent of the one Abu-Lughod offers and its complete opposite.

  9. The categorization Facebook offers largely depends on users’ self-disclosure and on the fact that Facebook offers multilingual versions of its platform. For example, users can create a Facebook group and categorize it as related to sports, and posts in that group will then automatically be classified as sports-related. Hence, companies like Extractive can learn about users’ actions and characteristics without reading, translating, or even accessing their posts (for a schematic illustration, see the first code sketch following these notes).

  10. Deep Packet Inspection is a method of extracting, examining, and managing network data. It is often used to assess the functioning of a network, but it can also function as a powerful surveillance device that can access not only the metadata of internet communication but also the data itself (Fuchs 2013) (see the second code sketch following these notes).

  11. While Israel has built itself a reputation as a “Start-up Nation” (Senor and Singer 2009), and while Israeli companies have had some prominent successes over recent decades, Israel remains a peripheral actor on the world map of technological innovation. A relatively young country with only 8.7 million citizens, Israel is almost 480 times smaller than the United States, and it is located at the heart of the turbulent Middle East. The small user base of Hebrew speakers, the country’s fraught geopolitical location, and its distance from major technological and economic centers often limit companies’ ability to “scale” into new markets. Accordingly, Israeli startups often choose to be acquired by and consolidated into larger international companies rather than continuing to grow independently and locally.
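The classification mechanism described in note 9 can be sketched in a few lines of Python. This is a didactic illustration only, not Extractive’s actual pipeline: the endpoint shape follows Facebook’s Graph API, but the specific fields, API version, token scopes (which Facebook has sharply restricted since 2018), and the profiling logic are all assumptions made for the example.

    # Illustrative sketch (see note 9): profiling via the platform's own
    # category labels rather than the text of users' posts. Endpoint shape
    # follows Facebook's Graph API; field names, API version, token scopes,
    # and the profiling logic are assumptions, not Extractive's pipeline.
    from collections import Counter

    import requests

    GRAPH = "https://graph.facebook.com/v12.0"

    def fetch_category(object_id, token):
        """Ask the platform for its own category label for a page or group."""
        resp = requests.get(
            f"{GRAPH}/{object_id}",
            params={"fields": "name,category", "access_token": token},
            timeout=10,
        )
        resp.raise_for_status()
        # e.g., "Sports Team" -- a label assigned in the platform's own terms
        return resp.json().get("category")

    def interest_profile(object_ids, token):
        """Tally platform categories into a language-independent profile."""
        categories = (fetch_category(oid, token) for oid in object_ids)
        return Counter(c for c in categories if c)

A profile built this way is language-independent: the profiler never parses the Korean, Chinese, or Hebrew text of a post, only the platform’s own category labels.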
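Similarly, the metadata/payload distinction drawn in note 10 can be made concrete with scapy, a widely used Python packet-manipulation library. This is a minimal teaching sketch, not a production DPI system: it watches unencrypted HTTP only, requires root privileges to sniff, and the port filter and packet count are arbitrary choices for the example.

    # Minimal sketch (see note 10): the difference between inspecting
    # metadata (who talks to whom) and the payload itself (what is said).
    from scapy.all import IP, Raw, sniff

    def inspect(pkt):
        if not pkt.haslayer(IP):
            return
        # Shallow inspection: metadata only -- source and destination.
        print(f"{pkt[IP].src} -> {pkt[IP].dst}")
        # Deep inspection: read the payload itself.
        if pkt.haslayer(Raw):
            payload = bytes(pkt[Raw].load)
            if payload.startswith((b"GET", b"POST")):
                # The HTTP request line reveals the requested resource.
                print("  payload:", payload.split(b"\r\n", 1)[0][:80])

    # Capture 20 packets of plain HTTP traffic and inspect each one.
    sniff(filter="tcp port 80", prn=inspect, count=20)

The first print stops at metadata; the second step—reading the request line itself—is what turns the same tool into the surveillance device Fuchs describes.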

Author information


Corresponding author

Correspondence to Dan M. Kotliar.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Kotliar, D.M. Data orientalism: on the algorithmic construction of the non-Western other. Theor Soc 49, 919–939 (2020). https://doi.org/10.1007/s11186-020-09404-2

