Privacy exchanges: restoring consent in privacy self-management

  • Original Paper
  • Published in Ethics and Information Technology

Abstract

This article reviews the qualitative changes that big data technology introduced into society, particularly changes that affect how individuals control the access, use and retention of their personal data. Of particular interest is whether the practice of privacy self-management in this new context can still ensure the informed consent of individuals to the privacy terms of big data companies. It is concluded that accepting big data companies’ privacy policies falls short of the disclosure and understanding requirements for informed consent. The article argues that the practice of privacy self-management could remain viable if the review, understanding and acceptance of privacy agreements are streamlined, standardized and automated. Technology should be employed to counter the privacy problems created by big data technology. The creation of privacy exchange authorities (PEA) is proposed as a solution to the failures of privacy self-management. The PEA are intermediaries that empower individuals to define their own privacy terms and express informed consent in their dealings with data companies. They will create the technological infrastructure for individuals to select their own privacy terms from a list of standard choices, potentially only once. The PEA will further mediate the delivery and authentication of the individual users’ privacy terms to data companies. A logical proof of concept is offered, illustrating the potential steps involved in the creation of the PEA.


Notes

  1. A recent survey by the National Telecommunications & Information Administration indicates that “Americans are increasingly concerned about online security and privacy at a time when data breaches, cybersecurity incidents, and controversies over the privacy of online services have become more prominent. These concerns are prompting some Americans to limit their online activity” (Goldberg 2016).

  2. The term “logical” here is used in the information system sense of logical (high level process definition) versus physical (low level technical implementation).

  3. Barocas and Nissenbaum define big data not as a technology, method, or practice, but rather, as “a paradigm… a way of thinking about knowledge through data and a framework for supporting decision making, rationalizing action, and guiding practice” (2014, p. 46).

  4. As Mayer-Schonberger and Cukier put it, “Before big data, our analysis was usually limited to testing a small number of hypotheses that we defined well before we even collected the data. When we let the data speak, we can make connections that we had never thought existed” (2013, p. 14). Consider the example of data scientists at Google developing a model correlating the search frequency of 45 terms with the spread of flu epidemics (Ginsburg et al. 2009).
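For illustration, a minimal sketch of this kind of correlate-and-select exercise in Python, run on synthetic data (the array sizes and the cutoff of 45 terms are stand-ins for the example; this is not Google's actual model):

```python
import numpy as np

# Synthetic weekly data: rows are weeks, columns are search terms.
rng = np.random.default_rng(0)
n_weeks, n_terms = 104, 500
search_freq = rng.random((n_weeks, n_terms))  # normalized query frequencies
flu_rate = rng.random(n_weeks)                # reported flu incidence per week

# Rank every term by its correlation with reported flu incidence
# and keep the 45 strongest predictors.
correlations = np.array(
    [np.corrcoef(search_freq[:, t], flu_rate)[0, 1] for t in range(n_terms)]
)
top_terms = np.argsort(-np.abs(correlations))[:45]

# Fit a simple least-squares model on the selected terms.
X = np.column_stack([search_freq[:, top_terms], np.ones(n_weeks)])
coef, *_ = np.linalg.lstsq(X, flu_rate, rcond=None)

# Estimate flu incidence for a new week from its search frequencies alone.
new_week = rng.random(n_terms)
estimate = np.append(new_week[top_terms], 1.0) @ coef
print(f"estimated flu incidence: {estimate:.3f}")
```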

  5. See also Ohm discussing reidentification (2010, pp. 1703–1704).

  6. Philip Howard argues that a new world order is emerging, a “pact between big technology firms and government…. The pax technica is a political, economic, and cultural arrangement of institutions and networked devices in which government and industry are tightly bound in mutual defense pacts, design collaborations, standards setting, and data mining” (2015, pp. 19–20).

  7. This case provides a useful tool for analyzing controversial big data practices, such as Google’s scanning of personal emails to determine which advertisements to display to the user. Consider a third case, where the spy trains a listening device on your house and translates the conversation not into spoken or written language but into a series of images and short films, which are in fact advertisements related to the content of your conversation. The third spy also appears to violate your privacy, and all three cases are identical from an ethical point of view. Displaying diaper advertisements to a secretly pregnant teenage girl could reveal her secret. Enough observations of targeted advertisements may enable a degree of reverse translation, or inference of the original messages.

  8. Barocas and Nissenbaum also conclude that valid informed consent is impossible in the context of big data. They identify what they call the transparency paradox. Complex policy terms are impossible to comprehend, but simplified terms are inadequate to inform. “For individuals to make considered decisions about privacy in this environment, they need to be informed about the types of information being collected, with whom it is shared, under what constraints, and for what purposes…. Simplified, plain-language notices cannot provide information that people need to make such decisions” (2014, pp. 58–59).

  9. According to Alexis Madrigal, “If it was your job to read privacy policies for 8 hours per day, it would take you 76 work days to complete the task” (2012).

  10. Strauss reports on multi-million-dollar lobbying efforts by Pearson and others to shape public opinion, promote high-stakes testing and resist student privacy legislation (2015).

  11. Woodrow Hartzog proposes building chain-link confidentiality into contracts in order to counteract downstream privacy violations. “To create the chain of protection, contracts would be used to link each new recipient of information to a previous recipient who wished to disclose the information” (2012, p. 683).
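The chain-link idea can be pictured as a linked data structure in which every onward disclosure creates a new contractual link inheriting the discloser's terms. A minimal sketch, with illustrative parties and obligations:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConfidentialityLink:
    """One link in the chain: a recipient bound to the original terms."""
    recipient: str
    obligations: tuple                             # confidentiality terms
    previous: "ConfidentialityLink | None" = None  # upstream discloser

    def disclose_to(self, new_recipient: str) -> "ConfidentialityLink":
        # Each onward disclosure forges a new link carrying the same
        # obligations, so the terms follow the data downstream.
        return ConfidentialityLink(new_recipient, self.obligations, self)

# The data subject sets the terms once...
origin = ConfidentialityLink("data subject", ("no resale", "delete on request"))
# ...and every downstream recipient is contractually linked back to them.
chain = origin.disclose_to("broker A").disclose_to("advertiser B")

link = chain
while link is not None:
    print(link.recipient, "bound by", link.obligations)
    link = link.previous
```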

  12. Daniel Solove reaches a similar conclusion. According to him, “privacy self-management faces several problems that together demonstrate that this paradigm alone cannot serve as the centerpiece of a viable privacy regulatory regime… cognitive problems, which concern challenges caused by the way humans make decisions, and… structural problems, which concern challenges arising from how privacy decisions are designed” (2013, p. 1883).

  13. See Schneier (2015), p. 132.

  14. This method of integrating a third-party authentication interface is popular; it is used, for instance, by PayPal to initiate a secure payment for eBay purchases without exposing the user’s payment method to the vendor. DocuSign uses the same method to store users’ digital signatures.
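As a rough illustration of such a handoff applied to the PEA, the sketch below has the data company redirect the user to the exchange, which returns a signed assertion of the user's chosen privacy terms for the company to verify. The URLs, identifiers and shared-secret scheme are assumptions made for the example, not a specification:

```python
import hashlib
import hmac
import json
from urllib.parse import urlencode

# Hypothetical secret shared between the exchange and the data company,
# established out of band when the company registers with the exchange.
SHARED_SECRET = b"demo-secret"

def build_redirect(company_id: str, return_url: str) -> str:
    # Step 1: the data company hands the user off to the exchange,
    # much as a vendor hands off to PayPal to authorize a payment.
    query = urlencode({"company": company_id, "return": return_url})
    return f"https://pea.example/authorize?{query}"

def issue_assertion(user_id: str, terms: list[str]) -> dict:
    # Step 2: the exchange returns the user's standard privacy terms,
    # signed so the company can verify they are authentic.
    payload = json.dumps({"user": user_id, "terms": terms}).encode()
    signature = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_assertion(assertion: dict) -> bool:
    # Step 3: the company checks the signature before relying on the terms.
    expected = hmac.new(SHARED_SECRET, assertion["payload"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["signature"])

print(build_redirect("acme-data", "https://acme.example/callback"))
assertion = issue_assertion("alice", ["no third-party sharing", "delete after 30 days"])
print("authentic:", verify_assertion(assertion))
```

A deployed exchange would more plausibly sign assertions with public key cryptography, as described in the next note, so that companies need hold no shared secret.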

  15. This is a non-technical explanation of public key cryptography. It “uses two keys—a public key known to everyone and a private or secret key known only to the recipient of the message. When John wants to send a secure message to Jane, he uses Jane's public key to encrypt the message. Jane then uses her private key to decrypt it” (Beal 2016).
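A minimal sketch of that exchange, using the third-party Python cryptography package (the John and Jane roles follow the quoted example; the key size and padding are illustrative choices):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Jane generates a key pair; the public half can be shared with everyone.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Standard OAEP padding parameters, reused for encryption and decryption.
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# John encrypts a message with Jane's public key...
ciphertext = public_key.encrypt(b"John's message to Jane", oaep)

# ...and only Jane's private key can decrypt it.
print(private_key.decrypt(ciphertext, oaep).decode())
```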

  16. Elizabeth Dwoskin reports a shift away from excessive data collection among new technology companies. She quotes Larry Gadea, founder and chief executive of Envoy, which develops visitor registration software: “We have to keep as little [information] as possible so that even if the government or some other entity wanted access to it, we’d be able to say we don’t have it” (2016). This trend intensified in the wake of the FBI’s dispute with Apple and the company’s strong pro-customer privacy stance. See Cook (2016).


Author information

Correspondence to Mario Pascalev.


Cite this article

Pascalev, M. Privacy exchanges: restoring consent in privacy self-management. Ethics Inf Technol 19, 39–48 (2017). https://doi.org/10.1007/s10676-016-9410-4
