Synthese (1-2):1-24 (2020)

Mark Alfano
Macquarie University
J. Adam Carter
University of Glasgow
YouTube has been implicated in the transformation of users into extremists and conspiracy theorists. The alleged mechanism for this radicalizing process is YouTube’s recommender system, which is optimized to amplify and promote clips that users are likely to watch through to the end. YouTube optimizes for watch-through for economic reasons: people who watch a video through to the end are likely to then watch the next recommended video as well, which means that more advertisements can be served to them. This is a seemingly innocuous design choice, but it has a troubling side-effect. Critics of YouTube have alleged that the recommender system tends to recommend extremist content and conspiracy theories, as such videos are especially likely to capture and keep users’ attention. To date, the problem of radicalization via the YouTube recommender system has been a matter of speculation. The current study represents the first systematic, pre-registered attempt to establish whether and to what extent the recommender system tends to promote such content. We begin by contextualizing our study in the framework of technological seduction. Next, we explain our methodology. After that, we present our results, which are consistent with the radicalization hypothesis. Finally, we discuss our findings, as well as directions for future research and recommendations for users, industry, and policy-makers.
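The optimization described above can be pictured with a minimal, purely hypothetical sketch: score each candidate clip by a predicted watch-through probability and serve the highest-scoring ones. Every function name, title, and number below is invented for illustration; the paper makes no claim about YouTube's actual implementation.

```python
# Toy greedy recommender: rank candidates by a (hypothetical) predicted
# probability that the user watches the clip through to the end.
# All data and names are invented for illustration only.

def recommend(candidates, top_k=3):
    """Return the top_k clips with the highest predicted watch-through."""
    return sorted(candidates, key=lambda c: c["p_watch_through"], reverse=True)[:top_k]

# Made-up candidate pool with made-up watch-through estimates.
candidates = [
    {"title": "cooking tutorial",     "p_watch_through": 0.41},
    {"title": "conspiracy deep-dive", "p_watch_through": 0.83},
    {"title": "news summary",         "p_watch_through": 0.55},
    {"title": "extremist monologue",  "p_watch_through": 0.78},
]

for clip in recommend(candidates):
    print(clip["title"], clip["p_watch_through"])
```

On this toy picture, the troubling side-effect falls out immediately: if attention-grabbing extremist or conspiratorial clips receive higher watch-through estimates, a purely watch-through-optimizing ranker will surface them first, with no term in the objective to penalize their content.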
Keywords: technological seduction, radicalization, transformative experience, YouTube, recommender system, conspiracy theory
DOI 10.1007/s11229-020-02724-x


Similar books and articles

Clustering Algorithms in Hybrid Recommender System on MovieLens Data.Urszula Kuzelewska - 2014 - Studies in Logic, Grammar and Rhetoric 37 (1):125-139.
A Multi-Agent Legal Recommender System.Lucas Drumond & Rosario Girardi - 2008 - Artificial Intelligence and Law 16 (2):175-207.
Technological Seduction and Self-Radicalization.Mark Alfano, Joseph Adam Carter & Marc Cheong - 2018 - Journal of the American Philosophical Association (3):298-322.
A New Authenticity? Communicative Practices on YouTube.Andrew Tolson - 2010 - Critical Discourse Studies 7 (4):277-289.
KevJumba and the Adolescence of YouTube.Roger Saul - 2010 - Educational Studies: A Journal of the American Educational Studies Association 46 (5):457-477.
Development of a Recommender System Based on Personal History.Katsuaki Tanaka, Koichi Hori & Masato Yamamoto - 2008 - Transactions of the Japanese Society for Artificial Intelligence 23 (6):412-423.

