Abstract
This article reviews the qualitative changes that big data technology has introduced to society, particularly changes that affect how individuals control the access, use and retention of their personal data. Of particular interest is whether the practice of privacy self-management in this new context can still ensure the informed consent of individuals to the privacy terms of big data companies. It is concluded that accepting big data companies’ privacy policies falls short of the disclosure and understanding requirements for informed consent. The article argues that the practice of privacy self-management could remain viable if the review, understanding and acceptance of privacy agreements are streamlined, standardized and automated. Technology should be employed to counter the privacy problems created by big data technology. The creation of privacy exchange authorities (PEA) is proposed as a solution to the failures of privacy self-management. The PEA are intermediaries that empower individuals to define their own privacy terms and express informed consent in their dealings with data companies. They will create the technological infrastructure for individuals to select their own privacy terms from a list of standard choices, potentially only once. The PEA will further mediate the delivery and authentication of individual users’ privacy terms to data companies. A logical proof of concept is offered, illustrating the potential steps involved in the creation of the PEA.
Notes
A recent survey of the National Telecommunications & Information Administration indicates that “Americans are increasingly concerned about online security and privacy at a time when data breaches, cybersecurity incidents, and controversies over the privacy of online services have become more prominent. These concerns are prompting some Americans to limit their online activity” (Goldberg 2016).
The term “logical” here is used in the information system sense of logical (high level process definition) versus physical (low level technical implementation).
Barocas and Nissenbaum define big data not as a technology, method, or practice, but rather, as “a paradigm… a way of thinking about knowledge through data and a framework for supporting decision making, rationalizing action, and guiding practice” (2014, p. 46).
As Mayer-Schonberger and Cukier put it, “Before big data, our analysis was usually limited to testing a small number of hypotheses that we defined well before we even collected the data. When we let the data speak, we can make connections that we had never thought existed” (2013, p. 14). Consider the example of data scientists at Google developing a model correlating the search frequency of 45 terms and the spread of flu epidemics (Ginsberg et al. 2009).
See also Ohm discussing reidentification (2010, pp. 1703–1704).
Philip Howard argues that a new world order is emerging, a “pact between big technology firms and government…. The pax technica is a political, economic, and cultural arrangement of institutions and networked devices in which government and industry are tightly bound in mutual defense pacts, design collaborations, standards setting, and data mining” (2015, pp. 19–20).
This case provides a great tool for the analysis of controversial big data practices, such as Google’s scanning of personal emails to determine what advertisements to display to the user. Consider a third case, where the spy trains a listening device on your house, and translates the conversation not into spoken or written language but a series of images and short films, which are in fact advertisements related to the content of your conversation. It appears that the third spy also violates your privacy, and all three cases are identical from an ethical point of view. Displaying diaper advertisements to a secretly pregnant teenage girl could out her secret. Enough observations of targeted advertisements may enable a degree of reverse translation, or inference of the original messages.
Barocas and Nissenbaum also conclude that valid informed consent is impossible in the context of big data. They identify what they call the transparency paradox. Complex policy terms are impossible to comprehend, but simplified terms are inadequate to inform. “For individuals to make considered decisions about privacy in this environment, they need to be informed about the types of information being collected, with whom it is shared, under what constraints, and for what purposes…. Simplified, plain-language notices cannot provide information that people need to make such decisions” (2014, pp. 58–59).
According to Alexis Madrigal, “If it was your job to read privacy policies for 8 h per day, it would take you 76 work days to complete the task” (2012).
Strauss reports on multi-million dollar lobbying efforts by Pearson and others to shape public opinion, promote high-stakes testing and resist student privacy legislation (2015).
Woodrow Hartzog proposes building chain-link confidentiality into contracts in order to counteract downstream privacy violations. “To create the chain of protection, contracts would be used to link each new recipient of information to a previous recipient who wished to disclose the information” (2012, p. 683).
Daniel Solove reaches a similar conclusion. According to him, “privacy self-management faces several problems that together demonstrate that this paradigm alone cannot serve as the centerpiece of a viable privacy regulatory regime… cognitive problems, which concern challenges caused by the way humans make decisions, and… structural problems, which concern challenges arising from how privacy decisions are designed” (2013, p. 1883).
See Schneier (2015), p. 132.
This method of integrating a third-party authentication interface is popular; it is used, for instance, by PayPal to initiate a secure payment for eBay purchases without exposing the user’s payment method to the vendor. DocuSign uses the same method to store users’ digital signatures.
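The delegation pattern described here — a trusted intermediary vouching for a user without exposing the user’s credentials to the vendor — can be sketched with a signed token, much as a PEA might attest to a user’s chosen privacy terms. The names `PrivacyExchange`, `issue_token` and `verify_token` are illustrative assumptions, not part of any cited system; a real deployment would rely on an established protocol such as OAuth.

```python
import hashlib
import hmac
import json

# Hypothetical sketch: a privacy exchange authority (PEA) signs a user's
# chosen privacy terms so a data company can verify them without ever
# seeing the user's credentials. All names here are illustrative only.
class PrivacyExchange:
    def __init__(self, secret: bytes):
        self._secret = secret  # key shared with participating data companies

    def issue_token(self, user_id: str, terms: dict) -> str:
        """Sign the user's privacy terms with an HMAC over the payload."""
        payload = json.dumps({"user": user_id, "terms": terms}, sort_keys=True)
        sig = hmac.new(self._secret, payload.encode(), hashlib.sha256).hexdigest()
        return payload + "." + sig

    def verify_token(self, token: str):
        """Return the payload if the signature is authentic, else None."""
        payload, _, sig = token.rpartition(".")
        expected = hmac.new(self._secret, payload.encode(), hashlib.sha256).hexdigest()
        return json.loads(payload) if hmac.compare_digest(sig, expected) else None

pea = PrivacyExchange(secret=b"demo-only-secret")
token = pea.issue_token("alice", {"retention_days": 30, "resale": False})
assert pea.verify_token(token)["terms"]["resale"] is False  # authentic token
assert pea.verify_token(token + "x") is None                # tampering detected
```

The design choice mirrors the PayPal/DocuSign pattern: the vendor receives only an attestation it can verify, never the underlying credentials.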
This is a non-technical explanation of public key cryptography. It “uses two keys—a public key known to everyone and a private or secret key known only to the recipient of the message. When John wants to send a secure message to Jane, he uses Jane's public key to encrypt the message. Jane then uses her private key to decrypt it” (Beal 2016).
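The quoted scheme can be illustrated with a deliberately tiny “textbook RSA” example — toy primes, no padding, completely insecure — purely to show the public/private key asymmetry. Real systems use vetted cryptographic libraries and far larger keys.

```python
# Toy RSA with tiny primes, for illustration only -- not secure.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent: Jane's public key is (e, n)
d = pow(e, -1, phi)        # 2753, Jane's private exponent (Python 3.8+)

message = 65                       # a message encoded as a number < n
ciphertext = pow(message, e, n)    # John encrypts with Jane's public key -> 2790
decrypted = pow(ciphertext, d, n)  # only Jane's private key recovers it

assert decrypted == message
```

Anyone holding (e, n) can encrypt, but only the holder of d can decrypt — exactly the asymmetry the note describes.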
Elizabeth Dwoskin reports a shift away from excessive data collection among new technology companies. She quotes Larry Gadea, founder and chief executive of Envoy, which develops visitor registration software: “We have to keep as little [information] as possible so that even if the government or some other entity wanted access to it, we’d be able to say we don’t have it” (2016). This trend intensified in the wake of the FBI's dispute with Apple and the strong pro-customer privacy stance taken by Apple. See Cook (2016).
References
Apache Software Foundation. (2014). Welcome to Apache™ Hadoop®! https://hadoop.apache.org/index.pdf. Accessed 14 January 2016.
Barocas, S., & Nissenbaum, H. (2014). Big data’s end run around anonymity and consent. In J. Lane, V. Stodden, S. Bender, & H. Nissenbaum (Eds.), Privacy, big data, and the public good: Frameworks for engagement (pp. 44–75). New York: Cambridge University Press.
Beal, V. (2016). Public key cryptography. http://www.webopedia.com/TERM/P/public_key_cryptography.html. Accessed 17 August 2016.
Beauchamp, T. L., & Childress, J. F. (2009). Principles of biomedical ethics. New York: Oxford University Press.
Boninger, F., & Molnar, A. (2016). Learning to be watched: Surveillance culture at school. The eighteenth annual report on schoolhouse commercialism trends. National Center for Education Policy at the University of Colorado at Boulder. http://nepc.colorado.edu/files/publications/RB%20Boninger-Molnar%20Trends.pdf. Accessed 17 May 2016.
Brunton, F., & Nissenbaum, H. (2013). Political and ethical perspectives on data obfuscation. In M. Hildebrandt & K. De Vries (Eds.), Privacy, due process and the computational turn: The philosophy of law meets the philosophy of technology (pp. 164–188). New York: Routledge.
Calo, R. (2014). Digital market manipulation. George Washington Law Review, 82, 995–1051.
Cook, T. (2016). A message to our customers. https://www.apple.com/customer-letter/. Accessed 17 May 2016.
Dwoskin, E. (2016). A shift away from big data: Tech firms race to protect users. The Washington Post. May 23. A1; A12.
Earle, G. (2016). Google’s extraordinary access to Obama revealed as White House visitor logs show 427 meetings between company and administration officials. Daily Mail. May 17, 2016. http://www.dailymail.co.uk/news/article-3595166/Google-s-extraordinary-access-revealed-White-House-visitor-logs-meeting-meeting-company-execs-Obama-administration-officials.html#ixzz492RElb9J. Accessed 17 May 2016.
Ginsberg, J., et al. (2009). Detecting influenza epidemics using search engine query data. Nature, 457, 1012–1014.
Goldberg, R. (2016). Lack of trust in internet privacy and security may deter economic and other online activities. May 13, 2016. https://www.ntia.doc.gov/blog/2016/lack-trust-internet-privacy-and-security-may-deter-economic-and-other-online-activities. Accessed 17 May 2016.
Hartzog, W. (2012). Chain-link confidentiality. Georgia Law Review, 46, 657–704.
Howard, P. (2015). Pax technica: How the internet of things may set us free or lock us up. New Haven: Yale University Press.
Davis, K., & Patterson, D. (2012). Ethics of big data: Balancing risk and innovation. Sebastopol, CA: O’Reilly.
Madrigal, A. (2012). If it was your job to read privacy policies for 8 h per day, it would take you 76 work days to complete the task. http://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/. Accessed 14 January 2016.
Mayer-Schonberger, V., & Cukier, K. (2013). Big Data: A revolution that will transform how we live, work, and think. New York: Houghton Mifflin Harcourt.
Ohm, P. (2010). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57, 1702–1777.
Richards, N. (2014). Intellectual privacy: Rethinking civil liberties in the digital age. New York: Oxford University Press.
Schneier, B. (2015). Data and Goliath: The hidden battles to capture your data and control your world. New York: W.W. Norton.
Solove, D. J. (2013). Privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880–1903.
Strauss, V. (2015). Report: Big education firms spend millions lobbying for pro-testing policies. The Washington Post. March 30, 2015. https://www.washingtonpost.com/news/answer-sheet/wp/2015/03/30/report-big-education-firms-spend-millions-lobbying-for-pro-testing-policies/. Accessed 17 May 2016.
Thomson, J. J. (1975). The right to privacy. Philosophy and Public Affairs, 4, 295–314.
Cite this article
Pascalev, M. Privacy exchanges: restoring consent in privacy self-management. Ethics Inf Technol 19, 39–48 (2017). https://doi.org/10.1007/s10676-016-9410-4