Abstract
Children continue to be overlooked in discussions around the ethical use of people’s data and information. Where children are the subject of such discussions, the focus tends to fall on privacy concerns and consent relating to the use of their data. This paper highlights the unique challenges children face when it comes to online interference with their decision-making, owing to their vulnerability, their impressionability, their increased likelihood of disclosing personal information online, and their still-developing capacities. These traits allow practices such as hypernudging to be targeted at them more accurately and with more serious consequences, specifically by potentially undermining their autonomy. We argue that children are autonomous agents in the making and thus require additional, special protections to ensure that the development of their autonomy is safeguarded. Measures should therefore be taken to prohibit most forms of hypernudging aimed at children and thus to protect them from this powerful technique of digital manipulation.
Availability of data and materials
Not applicable.
Code availability
Not applicable.
Notes
For the purposes of this paper, “Big Data” refers to the gathering, processing, and analysing of massive amounts of data (see Floridi and Mittelstadt 2016, p. 309).
The term “data cycle” can be understood as follows: data providers use online services; data collectors gather the data produced on those services; data analysts process these data into information; and data users make use of this information for different purposes (marketing, insurance, and so on). See Berman and Albright (2017, p. 23) for a detailed breakdown.
We will address the issue of what makes interference illegitimate in more detail below.
Admittedly, this formulation is vague, but it is sufficient for our purposes, as we are setting the bar for “undue” influence relatively high—i.e., influence that constitutes manipulation and coercion.
Also note that responsibility here refers to “role responsibility”, where someone can be held to be counterfactually responsible for their actions, regardless of any moral responsibility that may or may not pertain (see Genus and Stirling 2018, p. 62).
The distinction between one’s actions being authentically one’s own and one’s being responsible for such decisions and actions can be subtle and is much contested in the literature. For our purposes, it is sufficient to note that while I may authentically choose to raise my arm at time t, it is still possible for me to not be responsible for raising my arm at t, if it happens that someone else comes along and raises my arm for me at t.
It should be noted that various such models exist that differ in terms of detail and degrees of complexity. Here, we employ a maximally general conception of such a diachronic approach without committing ourselves to the finer-grained particulars. It is our contention that the practice of hypernudging children will fall foul of any such model.
See Cave (2007) for an extensive discussion.
See Dworkin (1988) who provides an illuminating discussion on how one can be autonomous while still being influenced, to some extent, by others.
“Primary values” here refer to the values without which no liberal democracy could be sustained, including life, happiness, freedom, knowledge, ability, resources, and security (Moor 1997, p. 29).
We use the status definition of “children” developed by Brighouse and Swift (2014, p. 62), according to which an individual is considered a child only if they possess four necessary features: vulnerability, dependence, the lack of a conception of value, and the “capacity to develop into nonvulnerable and independent adults”.
With “paternalism” here we have in mind violations of a person’s autonomy for the good of that person (see Dworkin 1988, p. 123).
Following Yeung (2017, p. 119), we will refer to companies like these as the “Big Data barons”.
Lupton (2017) gives an account of how pregnancy apps bring fetuses in utero into the data cycle.
See Giesinger (2019, p. 221).
For example, humans have a heuristic that makes them more likely to choose foods placed at the front of a selection. So, if the management of a school cafeteria wants the pupils to eat healthier foods, it does not have to exclude unhealthy foods from the selection. By placing the healthy foods at the front of the table and the unhealthy foods at the back, it can lead the pupils to unconsciously choose more of the healthy foods (Thaler and Sunstein 2008, p. 1).
Tahir et al. (2019) analysed 5000 videos on YouTube Kids and found that 20% contained “fake, explicit, or violent content”.
While the effectiveness of this particular campaign may be difficult to determine, empirical studies suggest that targeted “digital mass persuasion” using harvested data is not only possible but effective in changing the behaviour of its targets (e.g. Matz et al. 2017).
Among other campaigns, Cambridge Analytica used data captured from the Facebook pages of American voters to determine which voters were “persuadable” and then used targeted advertising to attempt to sway these voters to vote for Donald Trump in the 2016 US presidential election (see Markham et al. 2018; Susser et al. 2019). The company claims to have done something similar in other countries.
The Facebook Emotional Experiment was conducted by Facebook employees in collaboration with researchers at top US universities. It tested whether Facebook users could be made to experience negative emotions if their news feeds were filtered to include only negative content. The experiment was successful; however, it resulted in over 300,000 Facebook users being led to experience negative emotions (see Panger 2016; Shaw 2016).
Carolan (2018).
A case could be made that such a company could be required to pass such information on to an appropriate authority, but this issue lies beyond the scope of this paper.
According to the General Data Protection Regulation of the European Union, consent must be freely given, informed, specific, and unambiguous for it to be deemed legitimate (Van der Hof 2017, p. 128). Such legitimate consent would require an autonomous individual to provide it without coercion, influence, or manipulation.
References
Arneson R (1991) Autonomy and preference formation. In: Coleman J, Buchanan A (eds) In Harm’s way: essays in honor of Joel Feinberg. Cambridge University Press, Cambridge, pp 42–73
BBC News (2018) Cambridge Analytica-linked firm ‘boasted of poll interference’ [online]. Available at https://www.bbc.com/news/uk-43528219. Accessed 8 June 2021
BBC News (2021) Instagram for kids paused after backlash [online]. Available at https://www.bbc.com/news/technology-58707753. Accessed 28 September 2021
Berman G, Albright K (2017) Children and the data cycle: rights and ethics in a big data world. Innocenti Working Paper 2017-05. Florence. https://doi.org/10.13140/RG.2.2.20603.52008
Brighouse H, Swift A (2014) Family values: the ethics of parent-child relationships. Princeton University Press, Princeton, Oxford
Carolan M (2018) Big data and food retail: nudging out citizens by creating dependent consumers. Geoforum 90:142–150. https://doi.org/10.1016/j.geoforum.2018.02.006
Cave E (2007) What’s wrong with motive manipulation? Ethical Theory Moral Pract 10(2):129–144
Chaudron S et al (2017) Kaleidoscope on the internet of toys: safety, security, privacy and societal insights. https://doi.org/10.2788/05383
Chen W, Quan-Haase A (2020) Big data ethics and politics: toward new understandings. Soc Sci Comput Rev 38(1):3–9. https://doi.org/10.1177/0894439318810734
Christman J (1991) Autonomy and personal history. Can J Philos 21(1):1–24
Christman J (2006) Relational autonomy and the social dynamics of paternalism. Ethical Theory Moral Pract 17(3):369–382
Christman J (2020) Autonomy in moral and political philosophy. Stanf Encycl Philos. https://doi.org/10.5860/choice.41sup-0181
Chung G, Grimes SM (2005) Data mining the kids: surveillance and market research strategies in children’s online games. Can J Commun 30(4):527–548. https://doi.org/10.22230/cjc.2005v30n4a1525
Cohen JE (2000) Examined lives: informational privacy and the subject as object. Stan Law Rev 52:1373–1438
Dixon R, Nussbaum MC (2012) Children’s rights and a capabilities approach: the question of special priority. Cornell Law Rev 97(3):549–594
Dworkin G (1988) The theory and practice of autonomy. Cambridge University Press, Cambridge
Feinberg J (1986) The moral limits of the criminal law: harm to self. Oxford University Press, Oxford
Fischer JM, Ravizza M (1998) Responsibility and control: a theory of moral responsibility. Cambridge University Press, Cambridge
Floridi L (2015) Introduction. In: Floridi L (ed) The onlife manifesto: being human in a hyperconnected era. Springer, New York. https://doi.org/10.1007/978-3-319-04093-6_21
Floridi L, Taddeo M (2016) What is data ethics? Philos Trans R Soc A 374(2083):20160360. https://doi.org/10.1098/rsta.2016.0360
Genus A, Stirling A (2018) Collingridge and the dilemma of control: towards responsible and accountable innovation. Res Policy 47(1):61–69. https://doi.org/10.1016/j.respol.2017.09.012
Gheaus A (2018) Children’s vulnerability and legitimate authority over children. J Appl Philos 35:60–75. https://doi.org/10.1111/japp.12262
Giesinger J (2019) Vulnerability and autonomy—children and adults. Ethics Soc Welfare 13(3):216–229. https://doi.org/10.1080/17496535.2019.1647262
Gorshkova N, Robaina-Calderin L, Martin-Santana JD (2020) Native advertising: ethical aspects of kid influencers on YouTube. In: Pelegrín-Borondo J et al (eds) Paradigm shifts in ICT ethics: proceedings of ETHICOMP 2020. Universidad de La Rioja, La Rioja, pp 169–170
Grafanaki S (2017) Autonomy challenges in the age of big data. Fordham Intellect Prop Media Entertain Law J 27(4):803–868
Hilder P (2019) ‘They were planning on stealing the election’: explosive new tapes reveal Cambridge Analytica CEO’s boasts of voter suppression, manipulation and bribery [online]. openDemocracy. Available at https://www.opendemocracy.net/en/dark-money-investigations/they-were-planning-on-stealing-election-explosive-new-tapes-reveal-cambridg/. Accessed 8 June 2021
Holloway D, Green L (2016) The internet of toys. Commun Res Pract 2(4):506–519. https://doi.org/10.1080/22041451.2016.1266124
House of Commons Digital, Culture, Media and Sport Committee (2019) Disinformation and ‘fake news’: final report. Eighth Report of Session 2017–19. Available at https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/179102.html. Accessed 15 June 2021
IDC (2020) IDC’s global datasphere forecast shows continued steady growth in the creation and consumption of data [online]. Available at https://www.idc.com/getdoc.jsp?containerId=prUS46286020. Accessed 8 June 2021
Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York
Kelly H (2021) Instagram is making a kids’ app. Here’s what parents need to know about social media Jr. [online] washingtonpost.com. Available at https://www.washingtonpost.com/technology/2021/03/24/instagram-kids-faq/. Accessed 11 June 2021
Keymolen E, Van der Hof S (2019) Can I still trust you, my dear doll? A philosophical and legal exploration of smart toys and trust. J Cyber Policy 4(2):143–159. https://doi.org/10.1080/23738871.2019.1586970
Lanzing M (2019) “Strongly recommended” revisiting decisional privacy to judge hypernudging in self-tracking technologies. Philos Technol. 32:549–568. https://doi.org/10.1007/s13347-018-0316-4
Lupton D (2017) “It just gives me a bit of peace of mind”: Australian women’s use of digital media for pregnancy and early motherhood. Societies 7(25):1–13. https://doi.org/10.3390/soc7030025
Macleod C (2019) Paradoxes of children’s vulnerability. Ethics Soc Welfare 13(3):261–271. https://doi.org/10.1080/17496535.2019.1630465
Markham AN, Tiidenberg K, Herman A (2018) Ethics as methods: doing ethics in the era of big data research—introduction. Social Media Soc 4(3):1–9. https://doi.org/10.1177/2056305118784502
Matz SC, Kosinski M, Nave G, Stillwell DJ (2017) Psychological targeting as an effective approach to digital mass persuasion. Proc Natl Acad Sci USA 114(48):12714–12719. https://doi.org/10.1073/pnas.1710966114
McAfee A, Brynjolfsson E (2012) Big data: the management revolution. Harvard Bus Rev October, pp. 1–9. Available at http://tarjomefa.com/wp-content/uploads/2017/04/6539-English-TarjomeFa-1.pdf
McReynolds E, Hubbard S, Lau T, Saraf A, Cakmak M, Roesner F (2017) Toys that listen: a study of parents, children, and internet-connected toys. In: Conference on human factors in computing systems, Denver. https://doi.org/10.1145/3025453.3025735
Mele AR (1995) Autonomous agents: from self-control to autonomy. Oxford University Press, Oxford
Mills S (2019) Into Hyperspace: an analysis of hypernudges and personalised behavioural science. Available at SSRN https://ssrn.com/abstract=3420211
Mittelstadt BD, Floridi L (2015) The ethics of big data: current and foreseeable issues in biomedical contexts. Sci Eng Ethics 22:303–341. https://doi.org/10.1007/s11948-015-9652-2
Montgomery KC, Chester J, Milosevic T (2017) Children’s privacy in the big data era: Research opportunities. Pediatrics 140(2):117–121. https://doi.org/10.1542/peds.2016-1758O
Nissenbaum H (2011) A contextual approach to privacy online. Dædalus 140(4):32–48. https://doi.org/10.3233/978-1-61499-057-4-219
Panger G (2016) Reassessing the Facebook experiment: critical thinking about the validity of Big Data research. Inf Commun Soc 19(8):1108–1126. https://doi.org/10.1080/1369118X.2015.1093525
Protection of Personal Information Act No. 4 of 2013
Renda A (2020) Europe: toward a policy framework for trustworthy AI. In: Dubber MD, Pasquale F, Das S (eds) The Oxford handbook of ethics of AI. Oxford University Press, Oxford
Richterich A (2018) The big data agenda: data ethics and critical data studies. University of Westminster Press, London, pp 33–51
Sætra HS (2019) When nudge comes to shove: liberty and nudging in the era of big data. Technol Soc 59:1–10. https://doi.org/10.1016/j.techsoc.2019.04.006
Schweiger G (2019) Ethics, poverty and children’s vulnerability. Ethics Soc Welfare 13(3):288–330. https://doi.org/10.1080/17496535.2019.1593480
Shah DV, Cappella JN, Neuman WR (2015) Big data, digital media, and computational social science: possibilities and perils. Ann Am Acad Polit Soc Sci 659(1):6–13. https://doi.org/10.1177/0002716215572084
Shaw D (2016) Facebook’s flawed emotion experiment: antisocial research on social network users. Res Ethics 12(1):29–34. https://doi.org/10.1177/1747016115579535
Simons J, Ghosh D (2020) Utilities for democracy: Why and how the algorithmic infrastructure of Facebook and Google must be regulated. Brookings Institution. Available at https://scholar.harvard.edu/dipayan/publications/utilities-democracy-why-and-how-algorithmic-infrastructure-facebook-and-google
Solove DJ (2013) Introduction: privacy self-management and the consent dilemma. Harv Law Rev 126(7):1880–1903
Steeves V (2020) A dialogic analysis of Hello Barbie’s conversations with children. Big Data Soc 7(1):2–12
Susser D, Roessler B, Nissenbaum H (2019) Online manipulation: hidden influences in a digital world. Georgetown Law Technol Rev 4(1):1–52
Taylor R (2020) Review of online targeting: final report and recommendations. Centre for Data Ethics and Innovation, London
Thaler R, Sunstein C (2008) Nudge: improving decisions about health, wealth, and happiness. Penguin Books, New York
Tollon F (2021) Designed to seduce: epistemically retrograde ideation and YouTube’s recommender system. Int J Technoethics 12(2):60–71
Van der Hof S (2017) I agree... or do I? A rights-based analysis of the law on children’s consent in the digital world. Wis Int Law J 34(2):409–445
Véliz C (2020) Privacy is power. Penguin (Bantam Press), London, UK
Walker L (2015) Meet Hello Barbie: a wi-fi doll that talks to children. Newsweek, 17 February. Available at: https://www.newsweek.com/meet-hello-barbie-wi-fi-doll-talks-children-307482
Yeung K (2017) ‘Hypernudge’: Big Data as a mode of regulation by design. Inf Commun Soc 20(1):118–136. https://doi.org/10.1080/1369118X.2016.1186713
Acknowledgements
James Smith is grateful to the School for Data Science and Computational Thinking at Stellenbosch University for their generous bursary which has enabled him to pursue his postgraduate studies.
Funding
Not applicable.
Contributions
Not applicable.
Ethics declarations
Conflict of interest
Not applicable.
Ethics approval
Not applicable.
Consent to participate
Not applicable.
Consent for publication
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Smith, J., de Villiers-Botha, T. Hey, Google, leave those kids alone: Against hypernudging children in the age of big data. AI & Soc 38, 1639–1649 (2023). https://doi.org/10.1007/s00146-021-01314-w