
Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data

  • Original Paper
  • Ethics and Information Technology

Abstract

The “privacy paradox” refers to the discrepancy between the concern individuals express for their privacy and the apparently low value they actually assign to it when they readily trade personal information for low-value goods online. In this paper, I argue that the privacy paradox masks a more important paradox: the self-management model of privacy embedded in notice-and-consent pages on websites and other, analogous practices can be readily shown to underprotect privacy, even in the economic terms favored by its advocates. The real question, then, is why privacy self-management occupies such a prominent position in privacy law and regulation. Borrowing from Foucault’s late writings, I argue that this failure to protect privacy is also a success in ethical subject formation, as it actively pushes privacy norms and practices in a neoliberal direction. In other words, privacy self-management isn’t about protecting people’s privacy; it’s about inculcating the idea that privacy is an individual, commodified good that can be traded for other market goods. Along the way, the self-management regime forces privacy into the market, obstructs the functioning of other, more social, understandings of privacy, and occludes the various ways that individuals attempt to resist adopting the market-based view of themselves and their privacy. Throughout, I use the analytics practices of Facebook and social networking sites as a sustained case study of the point.


Notes

  1. Foucault’s work occupies a strange place in the study of privacy. On the one hand, his discussion of panopticism in Discipline and Punish (1977) arguably provided the organizing metaphor for an entire literature on surveillance, even if there has been a move over the last decade toward Deleuze (see, e.g., Haggerty 2006; Haggerty and Ericson 2000; Lyon 2006). On the other hand, if one considers “privacy” as an ethical norm or legal right, Foucault is almost entirely absent from the discussion (notable exceptions are Boyle 1997; Cohen 2013; Reiman 1995). This is no doubt due, in part, to the general treatment of privacy as a question of information disclosure by individuals (which makes the more sociological analysis of panopticism seem less immediately relevant), and to the subsequent adoption of a theory of decisional autonomy that Foucault rejected. For both of these, see the complaint in Cohen (2012a). What is lost in the reduction of Foucault to panopticism and of privacy to formal autonomy is Foucault’s post-Discipline and Punish work on techniques of governance, biopolitics, and the formation of ethical subjects. As a whole, this body of work treats the ways that various social practices contribute to a process of “subjection” (or “subjectification”) and, in so doing, how they help to make us who we are. There is considerable discussion about the compatibility of these phases of Foucault’s work with each other. In particular, many Foucault scholars (e.g., McNay 2009) think, generally to their frustration, that the later work on ethics is incompatible with, or at least on a completely different footing from, the work on biopolitics and governmentality. Foucault himself famously denied this charge, insisting that “it is not power, but the subject, which is the general theme of my research” over the preceding twenty years (1982, pp. 208–209). For a supportive assessment of Foucault on the point, see Lazzarato (2000). It is also possible that Foucault’s understanding of biopower changes between his introduction of the concept in 1976 and his later usage of it (for exemplary treatments, see Collier 2009; Protevi 2010). I will not attempt to resolve either debate here; for the sake of this paper, I will assert but not defend the view that the ethical “techniques of the self” open a path for studying the ways that biopolitics secures its own operation in the individual persons who use these techniques. Foucault’s studies of ancient Greek strategies for subjection, then, can be understood as models for studying analogous techniques in current society. Prima facie plausibility of this view comes from (a) Foucault’s own assertions (see above); (b) the extent to which the disciplinary techniques featured in Discipline and Punish are precisely about convincing individuals how they should view themselves as subjects of power; and (c) Foucault’s emphasis, in his discussion of American neoliberalism (Foucault 2008), on the attempt to reconfigure subjectivity along entrepreneurial lines. For discussion of this last point, see, e.g., (Hamann 2009).

  2. I draw the term “privacy self-management” from (Solove 2013). Privacy self-management is essentially the current version of privacy as “control” over personal information, as found in earlier sources such as Westin and Fried (1968). Early versions of the theory included significant attention to sociological literature that described privacy in terms of social group formation. For the ways that this attention diminished, especially in the construction of the argument in Westin, see (Steeves 2009). Complaints about the ubiquity of privacy self-management and its failures are common; in addition to Solove, see, e.g., (Cohen 2012a, 2013; Nissenbaum 2010).

    The situation in Europe is, at least on its face, quite different: the EU Data Protection Directive encodes a number of substantive privacy norms, which result in sector-specific differences from U.S. law [for a comparative discussion of healthcare, for example, see (Francis 2014–2015)]. The Directive is also currently undergoing an upgrade designed to strengthen it into a Regulation (the General Data Protection Regulation, GDPR). There is considerable skepticism, however, as to whether this process will succeed. Bert-Jaap Koops (2014), for example—citing some of the same sources discussed here, such as Solove (2013)—argues that the proposed Regulation relies too much on consent and its underlying value of autonomy. He also notes that the GDPR in its current version is extraordinarily complex, which reduces the likelihood that it will achieve effective protection, especially insofar as the complexity becomes a barrier to companies viewing privacy as a social value rather than a compliance-based hoop; a similar complaint is made in (Blume 2014). Thus, although considerations of space preclude extending the discussion into the details of EU law, it seems plausible that some of the same problems are present there as well.

  3. Although I will not pursue the point in detail, these problems are also problems for technical solutions that depend on predelegating privacy decisions to some sort of data vault or other software agent that encodes users’ data and privacy preferences and then attempts to negotiate automatically with websites and other service providers. Assuming that websites would comply with such an approach on the part of users—and this assumption needs independent justification since, as I will note, companies like Facebook deliberately make it difficult to protect one’s privacy—software agents could at most help with the difficulty of effectuating privacy preferences. The information asymmetries and uncertainty surrounding privacy decisions cannot be relieved by a software agent, and setting the agent to refuse to disclose information may still carry unacceptable costs for users. Further, the use of agents only reinforces the idea that privacy is an alienable market good, entrenching the consent mindset.

  4. This is not a new concern: see, for example, (Tavani 1998).

  5. GirlsARoundMe.com, visited 6/2014. The site’s front page makes periodic reference to “guys,” but this is the exception that proves the rule: the pictures are all of women.

  6. For behavioral economics and privacy, see particularly the work of Alessandro Acquisti, e.g., (Acquisti 2009; Grossklags and Acquisti 2007).

  7. Defaults matter. Research indicates that most individuals do not change software or other defaults (401(k) participation, for example, can be raised dramatically simply by switching from an opt-in to an opt-out default). The reasons for this are partly economic (changing defaults, especially on Facebook, takes time and effort) and partly normalizing: the default setting communicates what an “average” or “reasonable” user ought to prefer. See (Shah and Kesan 2007).

  8. One study found that nearly a quarter of respondents regretted mistaken oversharing on Facebook, reporting loss of important relationships and employment (Wang et al. 2011). In an earlier paper, two colleagues in human–computer interaction and I made the case that Facebook’s privacy problems are design-related: see (Hull et al. 2011). For more on the HCI implications of privacy, see, for example, (Dourish and Anderson 2006).

  9. For a theoretical development of this concern, see (Schermer et al. 2014).

  10. For more evidence of this point—that social norms and social factors like trust among group members influence disclosure behavior in the context of SNS—see, e.g., (Nov and Wattal 2009). One ethnographic study suggests that users care about social privacy (engaging in a number of privacy-protective behaviors while also using lax privacy settings to engage in some social surveillance) but not about what Facebook as a company does with their information (Raynes-Goldie 2010). Users also appear to carry social norms from one context into a new one that they perceive to be analogous (Martin 2012).

  11. On this point, see also (Reiman 1995), pointing out that privacy is therefore required for the development of the sort of subject who is able to rationally assess and trade away her privacy.

  12. Cohen’s claim about innovation—which flies in the face of orthodoxy—has not gone unchallenged. See, e.g., (Strahilevitz 2013, p. 2040 n. 125).

  13. See, e.g., (Amoore 2004) (on the individualization of risk in the workplace); (Binkley 2009) (parenting guides); (Cooper 2012) (increasing contingency of paid work); (Ericson et al. 2000) (disaggregation in insurance); (Feher 2009) (centrality of human capital and notions of entrepreneurship); (Lazzarato 2009) (importance of financialization); (Reddy 1996) (role of expert knowledge); and (Simon 2002) (rise of extreme sports as emblematic). For a very accessible general discussion, see (Brown 2005).

  14. For social networking, see the discussion and cites above. For social networking and robotics, see also (Turkle 2011). For violent video games, see (McCormick 2001; Waddington 2007; Wonderly 2008). I advance the thesis in the context of library filtering programs (2009) and digital rights management (Hull 2012). It is important to note that one does not have to be a reader of Foucault to arrive at this hypothesis: for the “extended mind” hypothesis and its application to technological environments, see (Clark 2003), and for an argument motivated very much by classical liberalism, see (Benkler 2006).

  15. On this, see, for example, (Binder et al. 2009), finding that “social space provided by SNS typically lacks boundaries and segmentation that are characteristics of offline networks. Boundaries between social spheres occur naturally in offline networks, mostly due to spatial/temporal separation of contacts. This elaborate structure is dropped in online environments” (p. 966).

  16. It is true that users can be nudged toward more privacy-protective behavior, but when undertaken in the therapeutic terms of behavioral economics, this nudging serves to entrench even further the framing of privacy as a problem for economic rationality.

  17. For the argument that the EU GDPR creates similar myths—both that privacy is more protected than it is, and that subjects are more empowered than they are—see (Blume 2014; Koops 2014).

  18. Koops suggests that the proposed EU regulations are stuck in a similar binarism: “EU data protection law applies an all-or-nothing approach: data is either personal data (triggering the whole regime), or it is not (triggering nothing), but it cannot be something in between or something else” (2014, 257).

References

  • Acquisti, A. (2009). Nudging privacy: The behavioral economics of personal information. IEEE Security & Privacy, 7(6), 82–85. doi:10.1109/MSP.2009.163.

  • Albergotti, R. (2014, June 12). Facebook to target ads based on web browsing. Wall Street Journal.

  • Allen, A. L. (2011). Unpopular privacy: What must we hide? Oxford: Oxford University Press.

  • Amoore, L. (2004). Risk, reward and discipline at work. Economy and Society, 33(2), 174–196. doi:10.1080/03085140410001677111.

  • Backstrom, L., & Kleinberg, J. (2014). Romantic partnerships and the dispersion of social ties: A network analysis of relationship status on Facebook. In Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, Baltimore, MD.

  • Barkhuus, L. (2012). The mismeasurement of privacy: Using contextual integrity to reconsider privacy in HCI. In Proceedings of CHI 2012, Austin, TX.

  • Barkhuus, L., & Tashiro, J. (2010). Student socialization in the age of Facebook. In Proceedings of the 28th International Conference on Human Factors in Computing Systems, Atlanta, GA.

  • Bazarova, N. N. (2012). Contents and contexts: Disclosure perceptions on Facebook. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, Seattle, WA.

  • Benkler, Y. (2006). The wealth of networks: How social production transforms markets and freedom. New Haven, CT: Yale University Press.

  • Binder, J., Howes, A., & Sutcliffe, A. (2009). The problem of conflicting social spheres: Effects of network structure on experienced tension in social network sites. In Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA.

  • Binkley, S. (2009). The work of neoliberal governmentality: Temporality and ethical substance in the tale of two dads. Foucault Studies, 6, 60–78.

  • Blume, P. (2014). The myths pertaining to the proposed General Data Protection Regulation. International Data Privacy Law, 4(4), 269–273. doi:10.1093/idpl/ipu017.

  • Boyd, D. (2007). Why youth (heart) social network sites: The role of networked publics in teenage social life. In D. Buckingham (Ed.), Youth, identity, and digital media (MacArthur Foundation Series on Digital Learning). Cambridge, MA: MIT Press.

  • Boyd, D. (2008). Facebook’s privacy trainwreck: Exposure, invasion, and social convergence. Convergence, 14(1), 13–20.

  • Boyd, D. (2014). What does the Facebook experiment teach us? Retrieved from http://www.zephoria.org/thoughts/archives/2014/07/01/facebook-experiment.html.

  • Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679. doi:10.1080/1369118x.2012.678878.

  • Boyle, J. (1997). Shamans, software and spleens: Law and the construction of the information society. Cambridge, MA: Harvard University Press.

  • Brown, W. (2005). Neoliberalism and the end of liberal democracy. In Edgework: Critical essays on knowledge and politics (pp. 37–59). Princeton, NJ: Princeton University Press.

  • Christofides, E., Muise, A., & Desmarais, S. (2009). Information disclosure and control on Facebook: Are they two sides of the same coin or two different processes? CyberPsychology & Behavior, 12(3), 341–345. doi:10.1089/cpb.2008.0226.

  • Clark, A. (2003). Natural-born cyborgs: Minds, technologies, and the future of human intelligence. Oxford: Oxford University Press.

  • Cohen, J. E. (2012a). Configuring the networked self: Law, code, and the play of everyday practice. New Haven, CT: Yale University Press.

  • Cohen, J. E. (2012b). Irrational privacy? Journal of Telecommunications and High Technology Law, 10, 241–249.

  • Cohen, J. E. (2013). What privacy is for. Harvard Law Review, 126, 1904–1933.

  • Collier, S. J. (2009). Topologies of power: Foucault’s analysis of political government beyond ‘governmentality’. Theory, Culture & Society, 26, 78–108.

  • Cooper, M. (2012). Workfare, familyfare, godfare: Transforming contingency into necessity. South Atlantic Quarterly, 111(4), 643–661. doi:10.1215/00382876-1724120.

  • Dourish, P., & Anderson, K. (2006). Collective information practice: Exploring privacy and security as social and cultural phenomena. Human–Computer Interaction, 21(3), 319–342. doi:10.1207/s15327051hci2103_2.

  • Duhigg, C. (2009). What does your credit-card company know about you? New York Times Magazine.

  • Ellison, N. B., Steinfield, C., & Lampe, C. (2007). The benefits of Facebook “friends”: Social capital and college students’ use of online social network sites. Journal of Computer-Mediated Communication, 12(4), 1143–1168. doi:10.1111/j.1083-6101.2007.00367.x.

  • Ericson, R., Barry, D., & Doyle, A. (2000). The moral hazards of neo-liberalism: Lessons from the private insurance industry. Economy and Society, 29(4), 532–558. doi:10.1080/03085140050174778.

  • Feher, M. (2009). Self-appreciation; or, the aspirations of human capital. Public Culture, 21(1), 21–41. doi:10.1215/08992363-2008-019.

  • Foucault, M. (1977). Discipline and punish: The birth of the prison. New York: Pantheon Books.

  • Foucault, M. (1982). The subject and power. In H. L. Dreyfus & P. Rabinow (Eds.), Michel Foucault: Beyond structuralism and hermeneutics (pp. 208–226). Chicago: University of Chicago Press.

  • Foucault, M. (1985). The use of pleasure (R. Hurley, Trans.). New York: Vintage Books.

  • Foucault, M. (2008). The birth of biopolitics: Lectures at the Collège de France, 1978–79 (G. Burchell, Trans.). New York: Palgrave Macmillan.

  • Francis, L. (2014–2015). Privacy and health information: The United States and the European Union. Kentucky Law Journal, 103, 419–431.

  • Fried, C. (1968). Privacy. Yale Law Journal, 77(3), 475–493.

  • Gross, R., & Acquisti, A. (2005). Information revelation and privacy in online social networks. In Proceedings of the 2005 ACM Workshop on Privacy in the Electronic Society, Alexandria, VA.

  • Grossklags, J., & Acquisti, A. (2007). What can behavioral economics teach us about privacy? In A. Acquisti, S. Gritzalis, C. Lambrinoudakis, & S. D. C. di Vimercati (Eds.), Digital privacy (pp. 363–377). Boca Raton, FL: Auerbach Publications.

  • Guthrie, K., & Sokolowsky, J. (2012). Obesity and credit risk.

  • Haggerty, K. D. (2006). Tear down the walls: On demolishing the panopticon. In D. Lyon (Ed.), Theorizing surveillance: The panopticon and beyond (pp. 23–45). Cullompton, Devon: Willan Publishing.

  • Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605–622. doi:10.1080/00071310020015280.

  • Hamann, T. H. (2009). Neoliberalism, governmentality, and ethics. Foucault Studies, 6, 37–59.

  • Harcourt, B. E. (2014). Governing, exchanging, securing: Big data and the production of digital knowledge. Paper presented at Big data, entreprises et sciences sociales—Usages et partages des données numériques de masse, Paris.

  • Hayles, N. K. (1999). How we became posthuman: Virtual bodies in cybernetics, literature, and informatics. Chicago: University of Chicago Press.

  • Hoofnagle, C. J., & Whittington, J. (2014). Free: Accounting for the costs of the internet’s most popular price. UCLA Law Review, 61, 606–670.

  • Hull, G., Lipford, H. R., & Latulipe, C. (2011). Contextual gaps: Privacy issues on Facebook. Ethics and Information Technology, 13(4), 289–302.

  • Hull, G. (2012). Coding the dictatorship of ‘the They’: A phenomenological critique of digital rights management. In M. Sanders & J. J. Wisnewski (Eds.), Ethics and phenomenology (pp. 197–220).

  • Jernigan, C., & Mistree, B. F. T. (2009). Gaydar: Facebook friendships expose sexual orientation. First Monday, 14(10).

  • Jones, H., & Soltren, J. H. (2005). Facebook: Threats to privacy. Retrieved from http://groups.csail.mit.edu/mac/classes/6.805/student-papers/fall05-papers/facebook.pdf.

  • King, J., Lampinen, A., & Smolen, A. (2011). Privacy: Is there an app for that? In Proceedings of the Seventh Symposium on Usable Privacy and Security, Pittsburgh, PA.

  • Koops, B.-J. (2014). The trouble with European data protection law. International Data Privacy Law, 4(4), 250–261. doi:10.1093/idpl/ipu023.

  • Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences. doi:10.1073/pnas.1218772110.

  • Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. doi:10.1073/pnas.1320040111.

  • Lampe, C., Ellison, N. B., & Steinfield, C. (2008). Changes in use and perception of Facebook. In Proceedings of the 2008 ACM Conference on Computer Supported Cooperative Work, San Diego, CA.

  • Lazzarato, M. (2000). Du biopouvoir à la biopolitique. Multitudes, 1(1), 45–57.

  • Lazzarato, M. (2009). Neoliberalism in action: Inequality, insecurity and the reconstitution of the social. Theory, Culture & Society, 26(6), 109–133. doi:10.1177/0263276409350283.

  • Liu, Y., Gummadi, K. P., Krishnamurthy, B., & Mislove, A. (2011). Analyzing Facebook privacy settings: User expectations vs. reality. In Proceedings of the 2011 ACM SIGCOMM Internet Measurement Conference, Berlin, Germany.

  • Lyon, D. (Ed.). (2006). Theorizing surveillance: The panopticon and beyond. Cullompton, Devon: Willan Publishing.

  • Madejski, M., Johnson, M., & Bellovin, S. M. (2011). The failure of online social network privacy settings. Columbia University Computer Science Technical Reports.

  • Martin, K. (2012). Information technology and privacy: Conceptual muddles or privacy vacuums? Ethics and Information Technology, 14(4), 267–284. doi:10.1007/s10676-012-9300-3.

  • Marwick, A. E., & Boyd, D. (2014). Networked privacy: How teenagers negotiate context in social media. New Media & Society. doi:10.1177/1461444814543995.

  • McCormick, M. (2001). Is it wrong to play violent video games? Ethics and Information Technology, 3(4), 277–287. doi:10.1023/a:1013802119431.

  • McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4(3), 540–565.

  • McDonald, A. M., & Cranor, L. F. (2010). Americans’ attitudes about internet behavioral advertising practices. In Proceedings of the 9th Annual ACM Workshop on Privacy in the Electronic Society, Chicago, IL.

  • McMahon, J. (2015). Behavioral economics as neoliberalism: Producing and governing homo economicus. Contemporary Political Theory, 14, 137–158.

  • McNay, L. (2009). Self as enterprise: Dilemmas of control and resistance in Foucault’s The Birth of Biopolitics. Theory, Culture & Society, 26(6), 55–77.

  • Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Palo Alto, CA: Stanford University Press.

  • Nov, O., & Wattal, S. (2009). Social computing privacy concerns: Antecedents and effects. In Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA.

  • Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.

  • Peppet, S. R. (2011). Unraveling privacy: The personal prospectus and the threat of a full disclosure future. Northwestern Law Review, 105(3), 1153–1204.

  • Protevi, J. (2010). What does Foucault think is new about neo-liberalism? Pli: Warwick Journal of Philosophy, 21.

  • Raynes-Goldie, K. (2010). Aliases, creeping, and wall cleaning: Understanding privacy in the age of Facebook. First Monday, 15(1).

  • Reddy, S. G. (1996). Claims to expert knowledge and the subversion of democracy: The triumph of risk over uncertainty. Economy and Society, 25(2), 222–254. doi:10.1080/03085149600000011.

  • Reiman, J. H. (1995). Driving to the panopticon: A philosophical exploration of the risks to privacy posed by the highway technology of the future. Santa Clara High Technology Law Journal, 11(1), 27–44.

  • Schermer, B., Custers, B., & van der Hof, S. (2014). The crisis of consent: How stronger legal protection may lead to weaker consent in data protection. Ethics and Information Technology, 16(2), 171–182. doi:10.1007/s10676-014-9343-8.

  • Semaan, B., & Mark, G. (2012). ‘Facebooking’ towards crisis recovery and beyond: Disruption as an opportunity. In Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, Seattle, WA.

  • Shah, R. C., & Kesan, J. P. (2007). Governing with information technologies. In Proceedings of the 8th Annual International Conference on Digital Government Research: Bridging Disciplines & Domains, Philadelphia, PA.

  • Simon, J. (2002). Taking risks: Extreme sports and the embrace of risk in advanced liberal societies. In T. Baker & J. Simon (Eds.), Embracing risk: The changing culture of insurance and responsibility (pp. 177–208). Chicago: University of Chicago Press.

  • Solove, D. J. (2007). ‘I’ve got nothing to hide’ and other misunderstandings of privacy. San Diego Law Review, 44, 745–772.

  • Solove, D. J. (2013). Privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880–1903.

  • Steeves, V. (2009). Reclaiming the social value of privacy. In I. Kerr, C. Lucock, & V. Steeves (Eds.), Lessons from the identity trail: Anonymity, privacy and identity in a networked society (pp. 191–208). Oxford: Oxford University Press.

  • Strahilevitz, L. J. (2013). Toward a positive theory of privacy law. Harvard Law Review, 126, 2010–2042.

  • Strandburg, K. J. (2013). Free fall: The online market’s consumer preference disconnect. University of Chicago Legal Forum, 2013, 95–172.

  • Tavani, H. T. (1998). Informational privacy, data mining, and the internet. Ethics and Information Technology, 1(2), 137–145. doi:10.1023/a:1010063528863.

  • Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7).

  • Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York: Basic Books.

  • Waddington, D. (2007). Locating the wrongness in ultra-violent video games. Ethics and Information Technology, 9(2), 121–128. doi:10.1007/s10676-006-9126-y.

  • Wang, Y., Norcie, G., Komanduri, S., Acquisti, A., Leon, P. G., & Cranor, L. F. (2011). “I regretted the minute I pressed share”: A qualitative study of regrets on Facebook. In Proceedings of the Seventh Symposium on Usable Privacy and Security, Pittsburgh, PA.

  • Wonderly, M. (2008). A Humean approach to assessing the moral significance of ultra-violent video games. Ethics and Information Technology, 10(1), 1–10. doi:10.1007/s10676-007-9149-z.

  • Young, A. L., & Quan-Haase, A. (2013). Privacy protection strategies on Facebook. Information, Communication & Society, 16(4), 479–500. doi:10.1080/1369118X.2013.777757.


Author information

Correspondence to Gordon Hull.


About this article


Cite this article

Hull, G. Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics Inf Technol 17, 89–101 (2015). https://doi.org/10.1007/s10676-015-9363-z
