
Hey, Google, leave those kids alone: Against hypernudging children in the age of big data

  • Open Forum
  • Published in AI & SOCIETY

Abstract

Children continue to be overlooked as a topic of concern in discussions around the ethical use of people’s data and information. Where children are the subject of such discussions, the focus is often primarily on privacy and consent relating to the use of their data. This paper highlights the unique challenges children face when it comes to online interference with their decision-making, stemming from their vulnerability, their impressionability, their increased likelihood of disclosing personal information online, and their developmental capacities. These traits allow practices such as hypernudging to be executed on children more accurately and with more serious consequences, most notably the potential undermining of their autonomy. We argue that children are autonomous agents in the making and thus require additional special protections to ensure that the development of their autonomy is safeguarded. This means that measures should be taken to prohibit most forms of hypernudging aimed at children, thereby protecting them from this powerful technique of digital manipulation.


Availability of data and materials

Not applicable.

Code availability

Not applicable.

Notes

  1. For the purposes of this paper, “Big Data” refers to the gathering, processing, and analysing of massive amounts of data (see Floridi and Mittelstadt 2016, p. 309).

  2. The term “data cycle” can be understood as follows: data providers use online services; data collectors gather the data produced on those services; data analysts process this data into information; and data users make use of this information for different purposes (marketing, insurance, and so on). See Berman and Albright (2017, p. 23) for a detailed breakdown.

  3. We will address the issue of what makes interference illegitimate in more detail below.

  4. Also see Arneson (1991), Mele (1995), Cohen (2000), and Christman (2020).

  5. Admittedly, this formulation is vague, but it is sufficient for our purposes, as we are setting the bar for “undue” influence relatively high—i.e., influence that constitutes manipulation and coercion.

  6. Also note that responsibility here refers to “role responsibility”, where someone can be held to be counterfactually responsible for their actions, regardless of any moral responsibility that may or may not pertain (see Genus and Stirling 2018, p. 62).

  7. The distinction between one’s actions being authentically one’s own and one’s being responsible for such decisions and actions can be subtle and is much contested in the literature. For our purposes, it is sufficient to note that while I may authentically choose to raise my arm at time t, it is still possible that I am not responsible for raising my arm at t, if someone else comes along and raises my arm for me at t.

  8. It should be noted that various such models exist, differing in detail and degree of complexity. Here, we employ a maximally general conception of such a diachronic approach without committing ourselves to the finer-grained particulars. It is our contention that the practice of hypernudging children will fall foul of any such model.

  9. See Cave (2007) for an extensive discussion.

  10. Various theorists have made the case that historical approaches to autonomy are better placed than structural accounts to deal with the ways in which one can be manipulated into making non-authentic (but seemingly autonomous) decisions. See Christman (1991, 2020), Mele (2001), and Cave (2007).

  11. See Dworkin (1988) who provides an illuminating discussion on how one can be autonomous while still being influenced, to some extent, by others.

  12. “Primary values” here refer to the values without which no liberal democracy could be sustained, including life, happiness, freedom, knowledge, ability, resources, and security (Moor 1997, p. 29).

  13. We use the status definition of “children” developed by Brighouse and Swift (2014, p. 62), according to which an individual counts as a child only if they possess four necessary features: vulnerability, dependence, no conception of value, and the “capacity to develop into nonvulnerable and independent adults”.

  14. With “paternalism” here we have in mind violations of a person’s autonomy for the good of that person (see Dworkin 1988, p. 123).

  15. E.g., Dixon and Nussbaum (2012), Gheaus (2018), Giesinger (2019), Macleod (2019), Schweiger (2019).

  16. Following Yeung (2017, p. 119), we will refer to companies like these as the “Big Data barons”.

  17. Lupton (2017) gives an account of how pregnancy apps bring in utero fetuses into the data cycle.

  18. Attempts at this kind of digital manipulation are not uncommon. See Lanzing (2019), Susser et al. (2019), Shah et al. (2015), Chen and Quan-Haase (2020), Richterich (2018), and Taylor (2020).

  19. See Giesinger (2019, p. 221).

  20. For example, humans have a heuristic that makes them more likely to choose foods placed at the front of a selection. So, if the management of a school cafeteria wants pupils to eat healthier foods, it does not have to exclude unhealthy foods from the selection. If the healthy foods are placed at the front of the table and the unhealthy foods at the back, pupils will unconsciously choose more of the healthy foods (Thaler and Sunstein 2008, p. 1).

  21. Tahir et al. (2019) analysed 5000 videos on YouTube Kids and found 20% to contain “fake, explicit, or violent content”.

  22. While the effectiveness of this particular campaign may be difficult to determine, empirical studies suggest that targeted “digital mass-persuasion” using harvested data is not only possible but effective in changing the behaviour of its targets (e.g., Matz et al. 2017).

  23. Among other campaigns, Cambridge Analytica used data captured from the Facebook pages of American voters to determine which voters were “persuadable” and then used targeted advertising in an attempt to sway these voters to vote for Donald Trump in the 2016 US elections (see Markham et al. 2018; Susser et al. 2019). The company claims to have run similar campaigns in other countries.

  24. The Facebook Emotional Experiment was conducted by Facebook employees in collaboration with researchers at top US universities. It tested whether Facebook users could be made to experience negative emotions if their news feeds were filtered to include only negative content. The experiment was successful; in succeeding, however, it induced negative emotions in over 300,000 Facebook users (see Panger 2016; Shaw 2016).

  25. Carolan (2018).

  26. A case could be made that such a company could be required to pass such information on to an appropriate authority, but this issue lies beyond the scope of this paper.

  27. According to the General Data Protection Regulation of the European Union, consent must be freely given, informed, specific, and unambiguous for it to be deemed legitimate (Van der Hof 2017, p. 128). Such legitimate consent would require an autonomous individual to provide it without coercion, undue influence, or manipulation.


Acknowledgements

James Smith is grateful to the School for Data Science and Computational Thinking at Stellenbosch University for the generous bursary that has enabled him to pursue his postgraduate studies.

Funding

Not applicable.

Author information


Contributions

Not applicable.

Corresponding author

Correspondence to James Smith.

Ethics declarations

Conflict of interest

Not applicable.

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Smith, J., de Villiers-Botha, T. Hey, Google, leave those kids alone: Against hypernudging children in the age of big data. AI & Soc 38, 1639–1649 (2023). https://doi.org/10.1007/s00146-021-01314-w

