The Authority to Moderate: Social Media Moderation and its Limits

  • Research Article
  • Published in Philosophy & Technology

Abstract

The negative impacts of social media have given rise to philosophical questions around whether social media companies have the authority to regulate user-generated content on their platforms. The most popular justification for that authority is to appeal to private ownership rights. Social media companies own their platforms, and their ownership comes with various rights that ground their authority to moderate user-generated content on their platforms. However, we argue that ownership rights can be limited when their exercise results in significant harms to others or the perpetration of injustices. We outline some of the substantive harms that social media platforms inflict through their practices of content moderation and some of the procedural injustices that arise through their arbitrary application of community guidelines. This provides a normative basis for calls to better regulate user-generated content on social media platforms. We conclude by considering some of the political and legal implications of our argument.

Data Availability

Not applicable.

Notes

  1. The term “censorship” has recently been co-opted by the American right and is now used to push back against the left’s socially progressive agenda, which the right unfavourably views as an attack on users’ speech rights online (Srinivasan, 2023). We use the term “censorship”, not in this way, but in the specific technical sense defined above that is common in the academic literature.

  2. Many moral issues arise regarding the exploitation of content moderation workers (see Roberts, 2019; Barnes, 2022). It is also worth mentioning that some platforms, such as Reddit, give their users the authority to moderate content on forums (also called subreddits). These volunteer moderators are estimated to provide Reddit with $3.4 million worth of unpaid labour annually (Stokel-Walker, 2022), leading to demands for payment.

  3. The social media platform Gab may be an exception to this observation (see Stanford Internet Observatory, 2022).

References

  • Alfano, M., Carter, J. A., & Cheong, M. (2018). Technological seduction and self-radicalization. Journal of the American Philosophical Association, 4(3), 298–322.

  • Alfano, M., Fard, A. E., Carter, J. A., Clutton, P., & Klein, C. (2020). Technologically scaffolded atypical cognition: The case of YouTube’s recommender system. Synthese, 199, 835–858.

  • Alfano, M., & Sullivan, E. (2022). Online Trust and Distrust. In M. Hannon & J. de Ridder (Eds.), The Routledge Handbook of Political Epistemology (pp. 480–491). Routledge.

  • Andrejevic, M. (2020). Automated Media. Routledge.

  • Andrejevic, M., & Volcic, Z. (2020). From Mass to Automated Media. In N. Witzleb, M. Paterson, & J. Richardson (Eds.), Big Data, Political Campaigning and the Law (pp. 17–33). Routledge.

  • Are, C. (2020). How Instagram’s Algorithm is Censoring Women and Vulnerable Users But Helping Online Abusers. Feminist Media Studies, 20(5), 741–744.

  • Balkin, J. M. (2022). To Reform Social Media, Reform Informational Capitalism. In L. Bollinger & G. R. Stone (Eds.), Social Media, Freedom of Speech and the Future of Our Democracy (pp. 233–254). Oxford University Press.

  • Barberá, P. (2020). Social Media, Echo Chambers, and Political Polarization. In N. Persily & J. A. Tucker (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 34–55). Cambridge University Press.

  • Barnes, R.M. (2022). Online extremism, AI, and (human) content moderation. Feminist Philosophy Quarterly, 8(3/4). Article 6.

  • Bejan, T. M. (2020). Free expression or equal speech? Social Philosophy and Policy, 37(2), 153–169.

  • Benkler, Y. (2022). Follow the Money, Back to Front. In L. Bollinger & G. R. Stone (Eds.), Social Media, Freedom of Speech and the Future of Our Democracy (pp. 255–272). Oxford University Press.

  • Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda. Oxford University Press.

  • Benkler, Y., Tilton, C., Etling, B., Roberts, H., Clark, J., Faris, R., Kaiser, J., and Schmitt, C. (2020). Mail-in Voter Fraud. Berkman Klein Center, 2020–6.

  • Bhagwat, A. (2019). Free Speech Categories in the Digital Age. In K. Gelber & S. J. Brison (Eds.), Free Speech in the Digital Age (pp. 88–103). Oxford University Press.

  • Bollinger, L., & Stone, G. R. (Eds.). (2022). Social Media, Freedom of Speech and the Future of Our Democracy. Oxford University Press.

  • Boxell, L., Gentzkow, M., and Shapiro, J.M. (2021). Cross-Country Trends in Affective Polarization. Review of Economics and Statistics, 1–61. https://doi.org/10.1162/rest_a_01160

  • Cho, J., Ahmed, S., Hilbert, M., Liu, B., & Luu, J. (2020). Do Search Algorithms Endanger Democracy? Journal of Broadcasting and Electronic Media, 64(2), 150–172.

  • Cingel, D.P., Carter, M.C., and Krause, H.V. (2022). Social Media and Self-Esteem. Current Opinion in Psychology, 45. https://doi.org/10.1016/j.copsyc.2022.101304.

  • Cobbe, J. (2021). Algorithmic Censorship by Social Platforms. Philosophy & Technology, 34(4), 739–766.

  • Cobbe, J., and Singh, J. (2019). Regulating recommending: motivations, considerations, and principles. European Journal of Law and Technology, 10(3).

  • Cohen, I. A., & Cohen, A. J. (2022). The Permissibility and Defensibility of Nonstate ‘Censorship.’ In J. P. Messina (Ed.), New Directions in the Ethics and Politics of Speech (pp. 13–31). Routledge.

  • Cohen, J., & Fung, A. (2021). Democracy and the Digital Public Sphere. In L. Bernholz, H. Landemore, & R. Reich (Eds.), Digital Technology and Democratic Theory (pp. 23–61). University of Chicago Press.

  • Cohen, J. (1993). Freedom of expression. Philosophy & Public Affairs, 22(3), 207–263.

  • Collins, B., Hoang, D. T., Nguyen, N. T., & Hwang, D. (2021). Trends in combating fake news on social media – a survey. Journal of Information and Telecommunication, 5(2), 247–266.

  • Cook, C. L., Patel, A., & Wohn, D. Y. (2021). Commercial versus volunteer. Frontiers in Human Dynamics, 3. https://doi.org/10.3389/fhumd.2021.626409

  • Davis, J. L., & Graham, T. (2021). Emotional Consequences and Attention Rewards. Information, Communication & Society, 24(5), 649–666.

  • Delgado, R., & Stefancic, J. (2018). Must we defend Nazis? New York University Press.

  • Digital Services Act (2022). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065&qid=1689750399256. Accessed 28 Jul 2023.

  • Douek, E. (2021). Governing Online Speech: From “Posts-As-Trumps” to proportionality and probability. Columbia Law Review, 121(3), 759–834.

  • Dwoskin, E., & Tiku, N. (2022). How Twitter, on the front lines of history, finally decided to ban Trump. Washington Post. https://www.washingtonpost.com/technology/2021/01/16/how-twitter-banned-trump/. Accessed 20 Jul 2023.

  • Everett, C. M. (2018). Free speech on privately-owned fora. Kansas Journal of Law & Public Policy, 28(1), 113–145.

  • Fabienne, P. (2017). Political Legitimacy. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy.

  • Facebook Community Standards. (2023). Transparency Center. https://transparency.fb.com/en-gb/policies/community-standards/. Accessed 28 Jul 2023.

  • Flew, T., & Wilding, D. (2020). The Turn to Regulation in Digital Communications. Media, Culture & Society, 43(1), 48–65.

  • Forestal, J. (2020). Constructing Digital Democracies. Political Studies, 69(1), 26–44.

  • Forestal, J. (2022). Designing for Democracy: How to Build Community in Digital Environments. Oxford University Press.

  • Formosa, P. (2017). Kantian Ethics. Cambridge University Press.

  • Formosa, P., & Mackenzie, C. (2014). Nussbaum, Kant, and the capabilities approach to dignity. Ethical Theory and Moral Practice, 17(5), 875–892.

  • Gainsbury, S. M., Delfabbro, P., King, D. L., & Hing, N. (2016). An exploratory study of gambling operators’ use of social media and the latent messages conveyed. Journal of Gambling Studies, 32, 125–141.

  • Gillespie, T. (2018). Custodians of the Internet. Yale University Press.

  • Gillespie, T. (2022). Do Not Recommend? Reduction As a Form of Content Moderation. Social Media + Society, 8(3), 20563051221117550.

  • Gillespie, T. (2020). Content moderation, AI, and the question of scale. Big Data & Society, 7(2). https://doi.org/10.1177/2053951720943234

  • Gorwa, R., & Guilbeault, D. (2020). Unpacking the Social Media Bot. Policy & Internet, 12(2), 225–248.

  • Gorwa, R., Binns, R., & Katzenbach, C. (2020). Algorithmic Content Moderation. Big Data & Society, 7(1), 2053951719897945.

  • Hao, K. (2021). How Facebook Got Addicted to Spreading Misinformation. Technology Review. https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/.  Accessed 1 Dec 2022.

  • Harriger, J. A., Thompson, J. K., & Tiggemann, M. (2023). TikTok, TikTok, the time is now: Future directions in social media and body image. Body Image, 44, 222–226.

  • Heinze, E. (2016). Hate Speech and Democratic Citizenship. Oxford University Press.

  • Hong, S., & Kim, S. H. (2016). Political polarization on Twitter. Government Information Quarterly, 33(4), 777–782.

  • Honoré, A. M. (2013). Ownership. In J. L. Coleman (Ed.), Readings in the Philosophy of Law. Routledge.

  • Howard, J. W. (2019). Free speech and hate speech. Annual Review of Political Science, 22, 93–109.

  • Howard, J. W. (2021). Extreme Speech, Democratic Deliberation, and Social Media. In C. Véliz (Ed.), The Oxford Handbook of Digital Ethics. Oxford University Press.

  • Hudders, L., De Jans, S., & De Veirman, M. (2021). The commercialization of social media stars. International Journal of Advertising, 40(3), 327–375.

  • Johnson, B. E., & Ho Youm, K. (2008). Commercial speech and free expression: The United States and Europe compared. Journal of International Media & Entertainment Law, 2(2), 159–198.

  • Katz, L. (2008). Exclusion and exclusivity in property law. The University of Toronto Law Journal, 58(3), 275–315.

  • Keller, D. (2021). Amplification and its discontents. Journal of Free Speech Law, 1, 227–268.

  • Klein, C., Clutton, P., & Polito, V. (2018). Topic modeling reveals distinct interests within an online conspiracy forum. Frontiers in Psychology, 9, 189.

  • Klein, C., Clutton, P., & Dunn, A. G. (2019). Pathways to conspiracy. PLOS ONE, 14(11), 1–23.

  • Klein, E. (2023). The Teen Mental Health Crisis, Part 1 (Podcast). The Ezra Klein Show. Apple Podcasts.

  • Lazar, S. (2023). Communicative Justice and the Political Philosophy of Attention. https://hai.stanford.edu/events/tanner-lecture-ai-and-human-values-seth-lazar.  Accessed 28 Jul 2023.

  • Lewandowsky, S., Cook, J., Fay, N., & Gignac, G. E. (2019). Science by social media. Memory & Cognition, 47(8), 1445–1456.

  • Lim, M., & Alrasheed, G. (2021). Beyond a technical bug. The Conversation. https://theconversation.com/beyond-a-technical-bug-biased-algorithms-and-moderation-are-censoring-activists-on-social-media-160669. Accessed 20 Jul 2023.

  • Llansó, E., Hoboken, J. V., Leerssen, P., & Harambah, J. (2020). Artificial intelligence, content moderation, and freedom of expression. Transatlantic Working Group on Content Moderation Online and Freedom of Expression. https://www.ivir.nl/publicaties/download/AI-Llanso-Van-Hoboken-Feb-2020.pdf. Accessed 20 Jul 2023.

  • Mackenzie, C., Rogers, W., & Dodds, S. (2014). Vulnerability: New Essays in Ethics and Feminist Philosophy. Oxford University Press.

  • Magarian, G. P. (2021). The Internet and Social Media. In A. Stone & F. Schauer (Eds.), The Oxford Handbook of Freedom of Speech (pp. 350–368). Oxford University Press.

  • Merten, L. (2021). Block, hide or follow—personal news curation practices on social media. Digital Journalism, 9(8), 1018–1039.

  • Mill, J. S. (2001). On Liberty. Electric Book Co.

  • Milmo, D. (2021). Facebook revelations. The Guardian. https://www.theguardian.com/technology/2021/oct/25/facebook-revelations-from-misinformation-to-mental-health. Accessed 30 Jul 2023.

  • Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2017). Reuters Institute Digital News Report 2017. Reuters Institute for the Study of Journalism.

  • Nguyen, C. T. (2018). Echo Chambers and Epistemic Bubbles. Episteme, 17(2), 141–161.

  • Nussbaum, M. C. (2000). Women and human development. Cambridge University Press.

  • Orben, A., & Przybylski, A. K. (2019). The association between adolescent well-being and digital technology use. Nature Human Behaviour, 3, 173–182.

  • Pettit, P. (2002). Republicanism: A Theory of Freedom and Government. Oxford University Press.

  • Pettit, P. (2013). On the people’s terms. Cambridge University Press.

  • Post, R. (2011). Participatory democracy and free speech. Virginia Law Review, 97(3), 477–489.

  • Ranttila, K. (2020). Social media and monopoly. Ohio Northern University Law Review, 46, 161–179.

  • Roberts, S. (2019). Behind the screen: Content moderation in the shadows of social media. Yale University Press.

  • Rozenshtein, A. Z. (2022). Moderating the fediverse. https://www.journaloffreespeechlaw.org/rozenshtein2.pdf. Accessed 20 Jul 2023.

  • Sahebi, S., & Formosa, P. (2022). Social media and its negative impacts on autonomy. Philosophy & Technology, 35(3), 1–24.

  • Savolainen, L. (2022). The shadow banning controversy. Media, Culture & Society, 44(6), 1091–1109.

  • Schiffer, Z., & Newton, C. (2023). Yes, Elon Musk Created a Special System for Showing You All His Tweets. The Verge. https://www.theverge.com/2023/2/14/23600358/elon-musk-tweets-algorithm-changes-twitter. Accessed 25 May 2023.

  • Schmidtz, D. (2010). Property and Justice. Social Philosophy and Policy, 27(1), 79–100.

  • Settle, J. E. (2018). Frenemies: How Social Media Polarizes America. Cambridge University Press.

  • Silver, L., & Huang, C. (2019). In emerging economies, smartphone and social media users have broader social networks. Pew Research Center. https://www.pewresearch.org/internet/2019/08/22/in-emerging-economies-smartphone-and-social-media-users-have-broader-social-networks/. Accessed 11 Oct 2023.

  • Srinivasan, A. (2023). Cancelled. London Review of Books, 45(13). https://www.lrb.co.uk/the-paper/v45/n13/amia-srinivasan/cancelled. Accessed 8 Oct 2023.

  • Srnicek, N. (2017). Platform Capitalism. Polity.

  • Stanford Internet Observatory. (2022). New report analyzes dynamics on alt-platform Gab. Cyber Policy Center. https://cyber.fsi.stanford.edu/io/news/sio-new-gab-report. Accessed 4 Oct 2023.

  • Stokel-Walker, C. (2022). Reddit Moderators do $3.4 million worth of unpaid work each year. New Scientist. https://www.newscientist.com/article/2325828-reddit-moderators-do-3-4-million-worth-of-unpaid-work-each-year/. Accessed 9 Oct 2023.

  • Suzor, N. (2019). Lawless. Cambridge University Press.

  • Terren, L., & Borge-Bravo, R. (2021). Echo Chambers on Social Media. Review of Communication Research, 9, 99–118.

  • Tuchman, A. E. (2019). Advertising and demand for addictive goods: The effects of e-cigarette advertising. Marketing Science, 38(6), 994–1022.

  • Tufekci, Z. (2017). Twitter and Tear Gas. Yale University Press.

  • Turner, P. N. (2014). “Harm” and Mill’s Harm Principle. Ethics, 124(2), 299–326.

  • Vaidhyanathan, S. (2018). Antisocial Media. Oxford University Press.

  • Vuorre, M., & Przybylski, A. K. (2023). Estimating the association between Facebook adoption and well-being in 72 countries. Royal Society Open Science, 10, 221451.

  • Waldron, J. (2012). The Harm in Hate Speech. Harvard University Press.

  • West, S. M. (2018). Censored, suspended, shadowbanned. New Media & Society, 20(11), 4366–4388.

  • West, L. J. (2021). Counter-Terrorism, Social Media, and the Regulation of Extremist Content. In S. Miller, A. Henschke, & J. Feltes (Eds.), Counter-Terrorism (pp. 116–128). Edward Elgar Publishing.

  • Zimmermann, A., & Lee-Stronach, C. (2022). Proceed with Caution. Canadian Journal of Philosophy, 52(1), 6–25.

Funding

None.

Author information

Contributions

BK and PF have both made substantial contributions to this manuscript.

Corresponding author

Correspondence to Bhanuraj Kashyap.

Ethics declarations

Ethics Approval and Consent to Participate

Not Applicable.

Consent for Publication

Consent from authors obtained.

Competing Interests

PF has received funds for an unrelated project from Meta and Facebook.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Kashyap, B., Formosa, P. The Authority to Moderate: Social Media Moderation and its Limits. Philos. Technol. 36, 78 (2023). https://doi.org/10.1007/s13347-023-00685-w

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s13347-023-00685-w

Keywords
