Inferences and the Right to Privacy

Regular Paper, The Journal of Value Inquiry


Notes

  1. By ‘legitimate’ I mean normatively legitimate, not epistemically legitimate.

  2. In recent years, there has been a legal discussion on whether inferences of personal information should be covered by the legal right to privacy (Wachter 2019; European Court of Justice 2017; Wachter & Mittelstadt 2019). To the extent that law should reflect morality, the Inference Principle has direct implications for this legal discussion.

  3. Throughout this paper, I shall assume for the sake of argument that privacy rights exist. I shall not commit to any particular view on what a moral right in general consists in, or what the relation in general is between rights and duties. Neither shall I commit to any particular view on whether the right to privacy is an absolute right or not.

  4. Barocas & Nissenbaum 2014, p. 44.

  5. Turkson et al. 2016; Kearns & Roth 2020.

  6. Price & Cohen 2019.

  7. Berk & Hyatt 2015. See also Lin et al. 2020 for recent skepticism about the accuracy of these algorithms.

  8. Tadesse et al. 2018.

  9. There is no consensus in the literature on what the right to informational privacy is, and what counts as a violation of this right. So-called control theorists believe that an agent’s right to privacy is violated when she loses the right kind of control over her personal information (or over the access to this information). For different versions of the control theory, see e.g. Moore 2003; Moore 2010; Inness 1992; Fried 1968; Parent 1983; Marmor 2015; Mainz & Uhrenfeldt 2020 and Menges 2020. So-called access theorists often add the extra necessary condition that someone must actually access the agent’s personal matters in order for her right to privacy to be violated. See e.g. Thomson 1975; Macnish 2018 and Lundgren 2020. For present purposes, I shall remain agnostic about which of these theories, if any, is true. However, the argument I make in this paper may have revisionary implications for some of these theories.

  10. Wachter 2019; Wachter & Mittelstadt 2019; Rumbold & Wilson 2019; Alben 2020; Barocas & Nissenbaum 2014 and Kröger (2019).

  11. Rumbold & Wilson 2019, p. 3.

  12. Floridi 2006, p. 116.

  13. To be clear, this is not to suggest that inferences cannot diminish an agent’s privacy in a non-normative sense.

  14. Note that the Inference Principle does not only involve ‘personal’ information. One reason for this is that it is notoriously difficult to distinguish personal information from non-personal information. A second reason is that pieces of information that are clearly personal can often be inferred from pieces of information that are clearly non-personal (Barocas & Nissenbaum 2014, p. 55). A third reason is that the principle also covers information that is completely non-personal in nature, regardless of where we draw the line between personal- and non-personal information.

  15. The principle concerns agents in general, not only individuals. Nevertheless, throughout the paper, I will mostly talk about information about individuals, and inferences made by individuals.

  16. For the sake of argument, set aside the off chance that Smith is on dialysis only because he likes it, has been forced to do it, or something similar.

  17. Conditional elimination is the inference rule at work in standard modus ponens arguments of the form ‘if p then q, p, therefore q’. It makes no relevant difference what exact inference rule is at play, or if the inference rule is a deductive one or not. The reader can easily construct different inferences involving different inference rules.
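The rule named in note 17 can be displayed in standard natural-deduction notation; this restates the inline schema 'if p then q, p, therefore q' and adds nothing beyond it:

```latex
% Conditional elimination (modus ponens):
% the premises 'p -> q' and 'p' license the conclusion 'q'.
\[
\frac{p \rightarrow q \qquad p}{q}
\]
```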

  18. There are two competing views in the literature on what it takes to waive one’s right to privacy. The first view holds that the right to privacy is limited to information that the right-holder has not intentionally made public. For discussion of this view, see Thomson 1975; Reiman 1976; Fried 1968; Schoeman 1984; Parent 1983; Ryberg 2007. The second view holds that the right to privacy at least sometimes extends to information that the right-holder has intentionally made public. For discussion of this view, see Nissenbaum 1998, 2009; Stahl 2020; Timan et al. 2017; Roessler 2016; Newell et al. 2018; Moreham 2006; Reidenberg 2014; Rumbold & Wilson 2019; and Margulis 2003.

    For the purpose of this paper, I need not commit to a particular view on what is required to waive one’s right to privacy. Regardless of what the correct view is, the Inference Principle implies that if an individual holds some information in accordance with this view, then the individual may infer any information from it without violating anyone’s right to privacy. I remain non-committal about what is required in order to come to hold the original information legitimately.

  19. What if Tom had asked for information (α) and (β) knowing that he can draw inference (γ) from them, while knowing that Smith does not know this, and while Smith would prefer that Tom did not know (γ)? In this case, we might say that Tom had obtained information (α) and (β) illegitimately by deceiving Smith, and that the Inference Principle therefore does not apply. I thank an anonymous reviewer for suggesting this point to me.

  20. This is not to suggest that the Inference Principle only applies if one piece of information is inferred from two pieces of information. The number of members in the respective sets is not important. If, for instance, I legitimately hold the information that all men have a significant risk of getting testicular cancer, then I may infer the information that Smith has a significant risk of getting testicular cancer without violating Smith’s right to privacy.

  21. Car choice is in fact a good predictor of political preferences. Owners of pickup trucks are generally likely to vote Republican, and owners of sedans are generally likely to vote Democrat. See Gebru et al. 2017.

  22. The model might output a precise estimate of the likelihood that Jones will vote Republican. It might, for instance, output that Jones is 85% likely to vote Republican.
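To make note 22 concrete, here is a hypothetical sketch of how such a model might produce a precise probability estimate. The car types and the tallies are invented purely for illustration; they are not drawn from Gebru et al. 2017 or any other source:

```python
# Hypothetical sketch: a toy model mapping car type to an estimated
# probability of voting Republican. All numbers are invented.

# Invented tallies: car type -> (Republican voters, Democratic voters)
TALLIES = {
    "pickup": (85, 15),
    "sedan": (30, 70),
}

def p_republican(car_type: str) -> float:
    """Estimate P(votes Republican | car type) as an observed frequency."""
    rep, dem = TALLIES[car_type]
    return rep / (rep + dem)

# A pickup owner like Jones would be estimated as 85% likely to vote Republican.
print(p_republican("pickup"))  # 0.85
```

A real system would estimate these frequencies from data rather than a hand-written table, but the output has the same shape: a single probability, not a categorical fact about Jones.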

  23. Nozick 1974, p. 151.

  24. Nozick 1974, p. 153–155.

  25. If the inference was not made correctly, then it might have generated a false belief in Tom’s mind. Theorists who follow Prosser’s theory of the right to privacy might argue that this would violate Smith’s right to privacy (Prosser 1960, p. 389). I find it strange, though, that producing false beliefs about other people should violate their right to privacy, but for the sake of argument, I simply stipulate that Tom makes the correct inference.

  26. See Nozick’s own discussion of this objection in Nozick 1974, p. 152–153.

  27. See Nozick 1974, p. 228–231.

  28. Mainz 2020, p. 5.

  29. Some authors do indeed seem to think that having certain thoughts can be harmful to others, because of downstream consequences caused by the thoughts (see Mendlow 2018; Dan-Cohen 1999). Others believe, perhaps controversially, that having certain thoughts can be wrongful in itself, despite the lack of any upstream or downstream explanations (Schroeder & Basu 2018). Schroeder & Basu (2018) touch upon the idea that having certain beliefs about others may violate their right to privacy, but the standard view seems to be that beliefs cannot constitute rights violations.

  30. Thomson 1975, p. 307.

  31. See e.g. Marmor 2015; Kappel 2013; Persson & Savulescu 2019. See, however, Munch 2021a for a critical discussion of this assumption.

  32. Plausibly, though, we do sometimes have duties to have certain beliefs about others. However, many of these duties are explained by their downstream consequences. One may, for instance, have a doxastic duty to have a certain belief if forming this belief is necessary to perform an action that one has a duty to perform. To illustrate, a medical doctor who has a duty to treat a patient has an appertaining doxastic duty to form a belief about, say, what disease the patient suffers from. Similarly, one may have a doxastic duty not to form certain beliefs, if not forming such beliefs is necessary to perform an action that one has a duty to perform. The medical doctor may have a doxastic duty not to form the belief that the patient suffers from a disease that she does not suffer from. However, failure to comply with doxastic duties like these does not, in itself, constitute a violation of a moral right to privacy.

  33. Note that on some views of the justification of privacy rights, the explanation for why some steps that lead to Q holding α, β, or γ are illegitimate has to do with the consequences of Q holding α, β, or γ. For instance, some privacy scholars think that the right to privacy is explained by an urgent moral interest in exercising control over how we present ourselves to others (see Marmor 2015). Other privacy scholars think the right to privacy is explained by an interest in avoiding that our personal information is somehow misused or exploited (see Parent 1983; Munch 2020), or because others’ access to our personal information somehow detriments our ability to autonomously form our identities, or detriments our ability to make autonomous decisions (see Feinberg 1986; Taylor 2002).

  34. Of course, there may be other objections to my argument. One candidate might be derived from the view recently defended by Lauritz Munch (2021b). He defends what he calls the ‘symmetry thesis’. According to this thesis, there are no good reasons to think that there are any privacy-related normative differences between standard cases where someone accesses someone else’s information by using an X-ray device, and cases where the exact same information is accessed through the means of statistical inferences. It is beyond the scope of this paper to provide a satisfying reply to Munch’s argument. However, I think that the Inference Principle offers a plausible explanation for why we often find X-ray cases objectionable, and statistical cases unobjectionable: If the information that the inference is based on is obtained legitimately, then the inference does not constitute a privacy violation.

  35. For a similar point, see Floridi 2006, p. 116.

  36. Rumbold & Wilson 2019, p. 12.

  37. Thomson 1975, p. 301.

  38. Rumbold & Wilson 2019, p. 10.

  39. Rumbold & Wilson 2019, p. 15.

  40. Rumbold & Wilson 2019, p. 4.

  41. Rumbold & Wilson 2019, p. 14.

  42. Rumbold & Wilson 2019, p. 14.

  43. Munch 2021b, p. 3780.

  44. However, as Munch notes, it is presumably desirable to minimize the occurrences of such misalignments. See Bolinger 2019 for discussion of this.

  45. One might object here that when one finds out that one had communicated consent unintentionally, one should be allowed to withdraw consent. While this is plausibly true in some situations, there are also situations in which we would normally say that one should have been more careful with what one communicated. By now, many people know that when they share personal information online, the information is used to make inferences. And when people choose to withdraw their consent from, say, Google, we normally think that this means that Google should not make any further inferences, not that they violated the data subject’s right to privacy by making the inferences in the past. Thanks to an anonymous reviewer for pointing this out to me.

  46. Rumbold & Wilson 2019, p. 13.

  47. Rumbold & Wilson 2019, p. 14.

  48. I presume that as a matter of psychological fact, it is at least sometimes impossible to form a certain belief b (or refrain from forming b) at will. This view, or at least something close to it, is known as ‘Doxastic Involuntarism’. See Peels 2015; Antill 2020; Roeber 2019. I also presume that it is at least sometimes psychologically possible to know in advance that one would not be able to refrain from forming b if one was presented with evidence e.

  49. Munch has recently called a duty of this type an ‘indirect doxastic duty’ not to form a certain belief (Munch 2021a). It is an indirect doxastic duty because the duty consists in acting in a way that indirectly avoids forming the belief in question.

  50. Or, he might be under an obligation to become ignorant of the information about Smith that he already knows. Becoming ignorant of information that one already knows may be psychologically possible in epistemically non-drastic ways in at least some cases (Matheson 2013). But it is still normatively controversial to hold that one can be under an obligation to become ignorant of certain information. In the case of Jones and Tim, it would be very strange to claim that Tim has an obligation to become ignorant about either the information about what car Jones drives, or the information about statistical correlations between car choice and political preferences, given that Tim has come to know both pieces of information in legitimate ways. Jones even wants Tim to know what car he drives.

  51. Rumbold & Wilson 2019, p. 14.

  52. Another version of this objection can be found in Floridi 2006, p. 116.

  53. Facebook has created so-called ‘shadow profiles’ of people who do not have a Facebook profile. These profiles also contain inferred information about non-users, based on information about users and certain connections between users and non-users (Garcia 2017).

  54. Floridi 2006, p. 116.

  55. This is in fact true in many cases. See Pew Research Center 2014.

  56. See Susser et al. 2019 for a discussion of this view.

  57. Garcia 2017.

  58. Garcia 2017, p. 1.

References

  • Alben, Alexander. 2020. When Artificial Intelligence and Big Data Collide—How Data Aggregation and Predictive Machines Threaten our Privacy and Autonomy. AI Ethics Journal 1 (1): 1–23.
  • Antill, Gregory. 2020. Epistemic Freedom Revisited. Synthese 197: 793–815.
  • Barocas, Solon, and Helen Nissenbaum. 2014. Big Data’s end run around anonymity and consent. In Privacy, Big Data, and the Public Good: Frameworks for Engagement, ed. Julia Lane, Victoria Stodden, Stefan Bender, and Helen Nissenbaum. Cambridge: Cambridge University Press.
  • Berk, Richard, and Jordan Hyatt. 2015. Machine Learning Forecasts of Risk to Inform Sentencing Decisions. Federal Sentencing Reporter 27 (4): 222–228.
  • Bolinger, Renée. 2019. Moral risk and communicating consent. Philosophy and Public Affairs 47: 179–207.
  • Dan-Cohen, Meir. 1999. Harmful Thoughts. Law and Philosophy 18 (4): 379–405.
  • European Court of Justice. 2017. Peter Nowak v Data Protection Commissioner, Case C-434/16.
  • Feinberg, Joel. 1986. The Moral Limits of the Criminal Law, vol. 3: Harm to Self. Oxford: Oxford University Press.
  • Floridi, Luciano. 2006. Four Challenges for a Theory of Informational Privacy. Ethics and Information Technology 8: 109–119.
  • Fried, Charles. 1968. Privacy. Yale Law Journal 77 (3): 475.
  • Garcia, David. 2017. Leaking Privacy and Shadow Profiles in Online Social Networks. Science Advances 3 (8): e1701172.
  • Gebru, Timnit, Jonathan Krause, Yilun Wang, Duyun Chen, Jia Deng, Erez Lieberman Aiden, and Li Fei-Fei. 2017. Using deep learning and Google Street View to estimate the demographic makeup of neighborhoods across the United States. Proceedings of the National Academy of Sciences of the United States of America 114 (50): 13108.
  • Inness, Julie. 1992. Privacy, Intimacy, and Isolation. Oxford: Oxford University Press.
  • Kappel, Klemens. 2013. Epistemological Dimensions of Informational Privacy. Episteme 10 (2): 179–192.
  • Kearns, Michael, and Aaron Roth. 2020. The Ethical Algorithm. Oxford: Oxford University Press.
  • Kröger, J. 2019. Unexpected Inferences from Sensor Data: A Hidden Privacy Threat in the Internet of Things. In IFIP Advances in Information and Communication Technology, vol. 548. Cham: Springer.
  • Lin, Zhiyuan, Jongbin Jung, Sharad Goel, and Jennifer Skeem. 2020. The limits of human predictions of recidivism. Science Advances. https://doi.org/10.1126/sciadv.aaz0652.
  • Lundgren, B. 2020. A Dilemma for Privacy as Control. The Journal of Ethics.
  • Macnish, Kevin. 2018. Government Surveillance and Why Defining Privacy Matters in a Post-Snowden World. Journal of Applied Philosophy 35 (2): 417–432.
  • Mainz, Jakob. 2020. But Anyone Can Mix Their Labor: A Reply to Cheneval. Critical Review of International Social and Political Philosophy. https://doi.org/10.1080/13698230.2020.1764786.
  • Mainz, Jakob, and Rasmus Uhrenfeldt. 2020. Too Much Info: Data Surveillance and Reasons to Favor the Control Account of the Right to Privacy. Res Publica 27: 287–302.
  • Margulis, S.T. 2003. Privacy as a Social Issue and Behavioral Concept. Journal of Social Issues 59 (2): 243–261.
  • Marmor, Andrei. 2015. What is the Right to Privacy? Philosophy & Public Affairs 43 (1): 3–26.
  • Matheson, David. 2013. A Duty of Ignorance. Episteme 10 (2): 193–205.
  • Mendlow, Gabriel. 2018. Why is It Wrong to Punish Thought? The Yale Law Journal 127 (8): 2204–2585.
  • Menges, Leonhard. 2020. A Defense of Privacy as Control. Journal of Ethics. https://doi.org/10.1007/s10892-020-09351-1.
  • Moore, Adam D. 2003. Privacy: Its Meaning and Value. American Philosophical Quarterly 40 (3): 215–227.
  • Moore, Adam D. 2010. Privacy Rights: Moral and Legal Foundations. University Park: Pennsylvania State University Press.
  • Moreham, N. 2006. Privacy in Public Places. Cambridge Law Journal 65 (3): 606–635.
  • Munch, Lauritz. 2020. The Right to Privacy, Control Over Self-presentation, and Subsequent Harm. Journal of Applied Philosophy 37 (1): 141–154.
  • Munch, Lauritz. 2021a. How Privacy Rights Engender Direct Doxastic Duties. The Journal of Value Inquiry. https://doi.org/10.1007/s10790-020-09790-x.
  • Munch, Lauritz. 2021b. Privacy Rights and ‘Naked’ Statistical Evidence. Philosophical Studies 178 (11): 3777–3795.
  • Newell, B., I. Skorvanak, T. Timan, and T. Chokrevski. 2018. A Typology of Privacy. University of Pennsylvania Journal of International Law 38 (2): Art. 4.
  • Nissenbaum, Helen. 1998. Protecting Privacy in an Information Age: The Problem of Privacy in Public. Law and Philosophy 17: 559–596.
  • Nissenbaum, Helen. 2009. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford, CA: Stanford University Press.
  • Nozick, Robert. 1974. Anarchy, State, and Utopia. New York: Basic Books.
  • Parent, William. 1983. Privacy, Morality, and Law. Philosophy & Public Affairs 12 (4): 269–288.
  • Peels, Rik. 2015. Believing at Will Is Possible. Australasian Journal of Philosophy 93 (3): 524–541.
  • Persson, Ingmar, and Julian Savulescu. 2019. The Irrelevance of a Moral Right to Privacy for Biomedical Moral Enhancement. Neuroethics 12 (1): 35–37.
  • Pew Research Center. 2014. Political Polarization in the American Public. https://www.pewresearch.org/politics/2014/06/12/political-polarization-in-the-american-public/. Accessed 21 Jan 2021.
  • Price, W.N., and I.G. Cohen. 2019. Privacy in the age of medical big data. Nature Medicine 25: 37–43.
  • Prosser, William. 1960. Privacy. California Law Review 48 (3): 383–423.
  • Reidenberg, Joel. 2014. Privacy in Public. University of Miami Law Review 69 (1): 141.
  • Reiman, Jeffrey. 1976. Privacy, Intimacy, and Personhood. Philosophy & Public Affairs 6 (1): 26–44.
  • Roeber, Blake. 2019. Evidence, Judgment, and Belief at Will. Mind 128 (511): 837–859.
  • Roessler, Beate. 2016. Privacy as a Human Right. Proceedings of the Aristotelian Society CXVII (2): 187–206.
  • Rumbold, Benedict, and James Wilson. 2019. Privacy Rights and Public Information. The Journal of Political Philosophy 27 (1): 3–25.
  • Ryberg, Jesper. 2007. Privacy Rights, Crime Prevention, CCTV, and the Life of Mrs. Aremac. Res Publica 13: 127–143.
  • Schoeman, Ferdinand. 1984. Philosophical Dimensions of Privacy: An Anthology. Cambridge: Cambridge University Press.
  • Schroeder, Mark, and Rima Basu. 2018. Doxastic Wronging. In Pragmatic Encroachment in Epistemology, ed. Brian Kim and Matthew McGrath. London: Routledge.
  • Stahl, Titus. 2020. Privacy in Public: A Democratic Defense. Moral Philosophy and Politics 7 (1): 73–96.
  • Susser, Daniel, Beate Roessler, and Helen Nissenbaum. 2019. Online Manipulation: Hidden Influences in a Digital World. Georgetown Law Technology Review 4 (1): 1–45.
  • Tadesse, Michael M., Hongfei Lin, Bo Xu, and Liang Yang. 2018. Personality Predictions Based on User Behavior on the Facebook Social Media Platform. IEEE Access 6: 61959–61969.
  • Taylor, James Stacey. 2002. Privacy and Autonomy: A Reappraisal. Southern Journal of Philosophy 40 (4): 587–604.
  • Tene, Omer, and Jules Polonetsky. 2013. Judged by the Tin Man: Individual Rights in the Age of Big Data. Journal on Telecommunications and High Technology Law 11: 351.
  • Thomson, Judith Jarvis. 1975. The Right to Privacy. Philosophy and Public Affairs 4 (4): 295–314.
  • Timan, T., B. Newell, and B. Koops. 2017. Privacy in Public Space: Conceptual and Regulatory Challenges. Cheltenham: Edward Elgar Publishing.
  • Turkson, R.E., E.Y. Baagyere, and G.E. Wenya. 2016. A machine learning approach for predicting bank credit worthiness. In Third International Conference on Artificial Intelligence and Pattern Recognition (AIPR), 1–7.
  • Wachter, Sandra. 2019. Data Protection in the age of Big Data. Nature Electronics 2: 6–7.
  • Wachter, Sandra, and Brent Mittelstadt. 2019. A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI. Columbia Law Review 2: 1–182.
  • Wrenn, C. 2007. Why There Are No Epistemic Duties. Dialogue 46 (1): 115–136.

Funding

This study was supported by Carlsbergfondet (Grant No. CF20-0257).

Author information

Correspondence to Jakob Mainz.

Additional information

I would like to thank Kasper Lippert-Rasmussen, Reuben Binns, Frej Klem Thomsen, Simon Laumann Jørgensen, Jens Damgaard Thaysen, Jes Lynning Harfeld, Jørn Sønderholm, Rasmus Uhrenfeldt, Lauritz Munch, and an anonymous reviewer for very helpful discussions of earlier versions of this paper.

Cite this article

Mainz, J. Inferences and the Right to Privacy. J Value Inquiry (2022). https://doi.org/10.1007/s10790-022-09911-8
