Abstract
Within a given conversation or information exchange, do privacy expectations change based on the technology used? Firms regularly require users, customers, and employees to shift existing relationships onto new information technology, yet little is known about how technology impacts established privacy expectations and norms. Coworkers are asked to adopt new information technology, users of Gmail are asked to use Google Buzz, patients and doctors are asked to keep health records online, and so on. Understanding whether and how privacy expectations change, and the mechanisms by which any such variance is produced, will help organizations manage these transitions. This paper examines whether and how privacy expectations change based on the technological platform of an information exchange. The results suggest that privacy expectations are significantly distinct when the information exchange is located on a novel technology as compared to a more established technology. Furthermore, this difference is best explained when modeled as a shift in privacy expectations rather than as fully technology-specific privacy norms. These results suggest that privacy expectations online are connected to privacy expectations offline, albeit with a different baseline. Surprisingly, of the five locations tested, respondents consistently assign information on email the greatest privacy protection. In addition, while undergraduate students differ from non-undergraduates when assessing a social networking site, no difference is found when judging an exchange on email. In sum, the findings suggest that novel technology may introduce temporary conceptual muddles rather than permanent privacy vacuums. The results reported here challenge conventional views about how privacy expectations differ online versus offline. Traditionally, management scholarship examines privacy online or on a specific new technology platform in isolation, without reference to the same information exchange offline.
However, in the present study, individuals appear to shift their privacy expectations while retaining similar factors and their relative importance—the privacy equation by which they form judgments—across technologies. These findings suggest that privacy scholarship should make use of existing privacy norms within contexts when analyzing and studying privacy on a new technological platform.
Notes
This negotiation over privacy norms is not synonymous with privacy as a commodity (Smith et al. 2011), a privacy calculus (Culnan and Armstrong 1999; Dinev and Hart 2006), or a second exchange (Culnan and Bies 2003), all of which assume individuals relinquish privacy in order to gain something in return. In other words, individuals are seen as giving up some measure of privacy to benefit from a transaction (e.g., customizing products or using electronic health records or having books suggested online). In this paper, the negotiation is over the privacy norm function; actors within a context negotiate what the privacy rules will be while retaining every expectation of privacy.
In comparison, in experiments, factors are designed orthogonal to each other but manipulated one at a time; however, in a traditional survey, many factors are examined but are not necessarily orthogonal to each other (Appelbaum et al. 2006).
Individuals regularly give access to information to people or organizations while keeping the same information from others. Alternatively, the restricted access version of privacy—where information that is inaccessible is private and that which is accessible is public—supports a dichotomy where information can be universally declared ‘public’ or ‘private.’ Many find the distinction to be false (Solove 2006; Nissenbaum 2004, 2009; Tufekci 2008) and “the idea of two distinct spheres, of the ‘public’ and the ‘private’ is in many ways an outdated concept” (Marwick et al. 2010). Or, as Nissenbaum states, “the crucial issue is not whether the information is private or public, gathered from private or public settings, but whether the action breaches contextual integrity” (2004, p. 134).
Respondent fatigue was a factor for some respondent groups. Two dummy variables were created to flag vignette ratings with a sequence number over 30 and over 20. If either dummy variable had a significant impact on the rating task in the ordinal regression model, the associated vignette ratings were discarded for that model and the regression was rerun without the offending data. However, a larger design issue came from the respondents’ learning curve—presumably due to the novelty of the survey design. Once the first two vignette ratings for each respondent (sequence numbers 1 and 2) were discarded for all respondents, the model fit criteria and parallel lines assumptions improved dramatically. Therefore, all vignette ratings with a sequence number of 1 or 2 were discarded for the entire analysis.
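The fatigue and learning-curve screen described in this note can be sketched in pandas. This is a hypothetical reconstruction: the column names `respondent`, `sequence`, and `rating` are assumptions, not taken from the original data set.

```python
import pandas as pd

def screen_fatigue(df: pd.DataFrame) -> pd.DataFrame:
    """Add fatigue dummies and drop the first two (learning-curve) ratings.

    Assumes one row per vignette rating, with a per-respondent
    'sequence' column giving the order in which vignettes were rated.
    """
    df = df.copy()
    # Dummy variables flagging ratings that occur late in the rating task
    df["seq_over_20"] = (df["sequence"] > 20).astype(int)
    df["seq_over_30"] = (df["sequence"] > 30).astype(int)
    # Discard the first two ratings per respondent (sequence numbers 1 and 2)
    return df[df["sequence"] > 2]
```

The dummies would then enter the ordinal regression as controls; if either were significant, the flagged ratings would be dropped for that model and the regression rerun.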
The rating task in the survey was a five-level ordinal scale (0–4), as shown in the “Appendix”; however, the distribution of the ratings was not normal around the mean. The levels 0–2 were combined, creating a new scale with three levels coded 1–3. Respondents were not given an option to answer “I don’t know” or “I need more information,” but they could skip a vignette and continue.
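The recoding described in this note amounts to a simple mapping; a minimal sketch, assuming the raw ratings are integers 0–4:

```python
def recode_rating(rating: int) -> int:
    """Collapse the five-level 0-4 scale to three levels coded 1-3.

    Levels 0-2 are combined into a single level (1); levels 3 and 4
    become levels 2 and 3, respectively.
    """
    if not 0 <= rating <= 4:
        raise ValueError("rating must be on the original 0-4 scale")
    if rating <= 2:
        return 1
    return rating - 1
```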
In order to examine respondent-level factors with OLS regression equations for each respondent, a minimum number of ratings per respondent was required. Therefore, all respondents who rated fewer than 20 vignettes were dropped, and their ratings were removed from the larger data set of vignette ratings. The number of respondents used in this study is thus 471 (rather than 811), and the total number of vignettes rated is 15,108 (rather than 21,187).
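This filtering step can be sketched in pandas (again a hypothetical reconstruction; the `respondent` and `rating` column names are assumptions):

```python
import pandas as pd

def drop_sparse_respondents(ratings: pd.DataFrame,
                            min_ratings: int = 20) -> pd.DataFrame:
    """Keep only ratings from respondents who rated at least
    `min_ratings` vignettes, so each respondent-level OLS equation
    has enough observations."""
    # Count ratings per respondent, broadcast back to each row
    counts = ratings.groupby("respondent")["rating"].transform("size")
    return ratings[counts >= min_ratings]
```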
Each respondent equation is based on between 20 and 40 rated vignettes (N = number of vignettes). Each OLS regression equation was performed using clustered and unclustered regression with no significant difference. In addition, neither undergraduate status nor gender was a statistically significant determinant of the number of vignettes rated.
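A per-respondent equation of this kind can be sketched as a plain least-squares fit. This is a minimal sketch, not the original analysis: the factor coding in `X` and the function name are assumptions.

```python
import numpy as np

def respondent_equation(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Fit one respondent's privacy 'equation' by ordinary least squares.

    X holds the vignette factor levels (N vignettes x k factors) for a
    single respondent, y the respondent's N ratings.  The returned
    coefficients (intercept first) estimate the relative importance
    of each factor in that respondent's judgments.
    """
    # Prepend an intercept column, then solve the least-squares problem
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return coef
```

Comparing the fitted coefficients across respondents (e.g., undergraduates versus non-undergraduates) is what allows factor importance, rather than only the ratings themselves, to be contrasted.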
Ideally, experience with each technology platform would also be examined here; however, experience with email or Facebook was not collected in the survey. Undergraduate status may be an imprecise proxy for experience with Facebook.
References
Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the Facebook. Privacy Enhancing Technologies, 4258, 36–58.
Altman, I. (1975). The environment and social behavior. Monterey, CA: Brooks/Cole.
Angst, C. M., & Agarwal, R. (2009). Adoption of electronic health records in the presence of privacy concerns: The elaboration likelihood model and individual persuasion. MIS Quarterly, 33(2), 339–370.
Appelbaum, L. D., Lennon, M. C., & Aber, J. L. (2006). When effort is threatening: The influence of the belief in a just world on Americans’ attitudes toward antipoverty policy. Political Psychology, 27(3), 387–402.
Awad, N. F., & Krishnan, M. S. (2006). The personalization privacy paradox: An empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS Quarterly, 30(1), 13–28.
Calo, M. (2010). People can be so fake: A new dimension to privacy and technology scholarship. Penn State Law Review, 114, 809.
Culnan, M. J., & Armstrong, P. K. (1999). Information privacy concerns, procedural fairness, and impersonal trust: An empirical investigation. Organization Science, 10(1), 104–115.
Culnan, M. J., & Bies, R. J. (2003). Consumer privacy: Balancing economic and justice considerations. Journal of Social Issues, 59(2), 323–342.
Culnan, M. J., & Williams, C. C. (2009). How ethics can enhance organizational privacy: Lessons from the Choicepoint and TJX data breaches. MIS Quarterly, 33(4), 673–687.
Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61–80.
Floridi, L. (2006a). Information ethics, its nature and scope. Computers and Society, 36(3), 21–36.
Floridi, L. (2006b). Four challenges for a theory of informational privacy. Ethics and Information Technology, 8(3), 109–119.
Ganong, L. H., & Coleman, M. (2006). Multiple segment factorial vignette designs. Journal of Marriage and Family, 69(2), 455–468.
Grimmelmann, J. (2010). Privacy as product safety. Widener Law Journal, 19, 793.
Hoofnagle, C. J, King, J., Li, S., & Turow, J. (2010). How different are young adults from older adults when it comes to information privacy attitudes and policies? (April 14, 2010). Available at SSRN: http://ssrn.com/abstract=1589864.
Hui, K., Teo, H., & Sang-Yong, T. L. (2007). The value of privacy assurance: An exploratory field experiment. MIS Quarterly, 31(1), 19–33.
Hull, G., Lipford, H. R., & Latulipe, C. (2010). Contextual gaps: Privacy issues on Facebook. Ethics and Information Technology, 13, 1–37.
Jasso, G. (2006). Factorial survey methods for studying beliefs and judgments. Sociological Methods & Research, 34(3), 334–423.
Jasso, G., & Opp, K. (1997). Probing the character of norms: A factorial survey analysis of the norms of political action. American Sociological Review, 62, 947–964.
Johnson, D. (2004). Computer ethics. In L. Floridi (Ed.), The Blackwell guide to the philosophy of computer and information (pp. 65–75). Oxford, UK: Blackwell Publishers Limited.
Kennedy, P. (2003). A guide to econometrics (5th ed.). Cambridge, MA: MIT Press.
Kuo, F., Lin, C., & Hsu, M. (2007). Assessing gender differences in computer professionals’ self-regulatory efficacy concerning information privacy practices. Journal of Business Ethics, 73(2), 145–160.
Levitt, S. D., & List, J. A. (2007). What do laboratory experiments measuring social preferences reveal about the real world? The Journal of Economic Perspectives, 21(2), 153–174.
Lynch, J. G. (1982). The role of external validity in theoretical research. Journal of Consumer Research, 10(1), 109–111.
Malhotra, N. K., Kim, S. S., & Agarwal, J. (2004). Internet users’ information privacy concerns (IUIPC): The construct, the scale, and a causal model. Information Systems Research, 15(4), 336–355.
Margulis, S. T. (1977). Conceptions of privacy: Current status and next steps. Journal of Social Issues, 33(3), 5–21.
Martin, K. (2012). Diminished or just different? A factorial vignette study of privacy as a social contract. Journal of Business Ethics. doi:10.1007/s10551-012-1215-8.
Marwick, A. E., Murgia-Diaz, D., & Palfrey, J. G. (2010). Youth, privacy and reputation (literature review). Berkman Center Research Publication No. 2010-5; Harvard Public Law Working Paper No. 10–29. Available at SSRN: http://ssrn.com/abstract=1588163.
Moor, J. (1985). What is computer ethics? Metaphilosophy, 16(4), 266–275.
Moor, J. (1997). Towards a theory of privacy in the information age. Computers and Society, 27(3), 27–32.
Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79(1), 119–158.
Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford, CA: Stanford University Press.
Nock, S., & Guterbock, T. M. (2010). Survey experiments. In J. Wright & P. Marsden (Eds.), Handbook of survey research (2nd ed., pp. 837–864). Bingley, UK: Emerald Group Publishing Limited.
O’Connell, A. A. (2005). Logistic regression models for ordinal response variables. Thousand Oaks, CA: Sage Publications.
Pavlou, P. A., Liang, H., & Xue, Y. (2007). Understanding and mitigating uncertainty in online exchange relationships: A principal-agent perspective. MIS Quarterly, 31(1), 105–136.
Posner, R. A. (1981). The economics of privacy. The American Economic Review, 71(2), 405–409.
Rossi, P., & Nock, S. (Eds.). (1982). Measuring social judgments: The factorial survey approach. Beverly Hills, CA: Sage.
Schoeman, F. (Ed.). (1984). Philosophical dimensions of privacy: An anthology. Cambridge: Cambridge University Press.
Smith, J. H., Dinev, T., & Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1015.
Smith, H. J., Milberg, S. J., & Burke, S. J. (1996). Information privacy: Measuring individuals’ concerns about organizational practices. MIS Quarterly, 20(2), 167–196.
Solove, D. J. (2006). A taxonomy of privacy. University of Pennsylvania Law Review, 154(3), 477.
Tavani, H. T. (2008). Floridi’s ontological theory of informational privacy: Some implications and challenges. Ethics and Information Technology, 10(2–3), 155–166.
Taylor, B. J. (2006). Factorial surveys: Using vignettes to study professional judgment. British Journal of Social Work, 36, 1187–1207.
Thurman, Q. C., Lam, J. A., & Rossi, P. H. (1988). Sorting out the cuckoo’s nest: A factorial survey approach to the study of popular conceptions of mental illness. The Sociological Quarterly, 29(4), 565–588.
Tufekci, Z. (2008). Can you see me now? Audience and disclosure regulation in online social network sites. Bulletin of Science, Technology & Society, 28(1), 20–36.
Van den Hoven, J. (2008). Information technology, privacy, and the protection of personal data. In J. Weckert & J. van den Hoven (Eds.), Information technology and moral philosophy (pp. 301–321). Cambridge: Cambridge University Press.
Wallander, L. (2009). 25 years of factorial surveys in sociology: A review. Social Science Research, 38, 505–520.
Weisband, S. P., & Reinig, B. A. (1995). Managing user perceptions of email privacy. Communications of the ACM, 38(12), 40–47.
Westin, A. (1967). Privacy and freedom. New York: Atheneum.
Young, A. L., & Quan-Haase, A. (2009). Information revelation and internet privacy concerns on social network sites: A case study of Facebook. In C&T ’09: Proceedings of the fourth international conference on communities and technologies.
Appendix
Sample vignettes
In general
[NAME] is a [MEMBERSHIP] college student [SPACE]. [LOCATION A] [NAME] [LOCATION B] from a fellow team member talking about [CONTENT]. [ACCESS]. The next day, [NAME] shared the information with [DISTRIBUTION].
Sample 1:
Ryan is a senior college student on an assigned project team for a required class. While on Facebook, Ryan received a newsfeed from a fellow team member talking about problems with his mom. Ryan was not sure that his teammate realized that he saw the information. The next day, Ryan shared the information with other students on the project team, including the professor.
Sample 2:
Kevin is a new college student on a varsity athletic team. While on Facebook, Kevin saw a wall post from a fellow team member talking about a date that went horribly wrong. Kevin was not sure that his teammate realized that he saw the information. The next day, Kevin shared the information with other members of the team.
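The bracketed template above can be sketched as a simple substitution over one drawn level per factor. This is a hypothetical rendering (the original survey software is not described); the dictionary keys mirror the bracketed factor names.

```python
# Template mirroring the bracketed vignette skeleton from the appendix
TEMPLATE = (
    "{name} is a {membership} college student {space}. "
    "{location_a} {name} {location_b} from a fellow team member "
    "talking about {content}. {access} "
    "The next day, {name} shared the information with {distribution}."
)

def render_vignette(levels: dict) -> str:
    """Fill the template with one chosen level per vignette factor."""
    return TEMPLATE.format(**levels)

# Levels reproducing Sample 2 above
sample_2 = render_vignette({
    "name": "Kevin",
    "membership": "new",
    "space": "on a varsity athletic team",
    "location_a": "While on Facebook,",
    "location_b": "saw a wall post",
    "content": "a date that went horribly wrong",
    "access": ("Kevin was not sure that his teammate realized "
               "that he saw the information."),
    "distribution": "other members of the team",
})
```

Drawing levels at random per factor, rather than fixing them, is what yields the (near-)orthogonal factor combinations the factorial vignette design relies on.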
Vignette factors
| # | Attribute | Level | Dimension | Operationalized |
|---|---|---|---|---|
| 1 | Space | 0 | Well defined—athletic team | On a varsity athletic team |
| | | 1 | Ill defined—randomly assigned group | On an assigned project team for a required class |
| 2 | Access | 0 | Given willingly | |
| | | 1 | Coerced | [NAME]’s teammate only shared the information reluctantly after being chided by other students on the team. |
| | | 2 | Overheard | [NAME] was not sure that his teammate realized that he heard/received the information. |
| 3 | Content | 0 | Public | Housing decisions for next semester |
| | | 1 | Role based | Who is going to start for the next game/how the projects were assigned |
| | | 2 | Personal I | A date that went horribly wrong |
| | | 3 | Family | Problems with his mom |
| | | 4 | Private | An embarrassing medical condition |
| 4 | Location | 0 | Verbal inside role-based space | While in the locker room/study room…heard |
| | | 1 | Verbal outside role-based space | While in the cafeteria…heard |
| | | 2 | Email | While checking his messages…received an e-mail |
| | | 3 | Facebook newsfeed | While on Facebook…received a newsfeed |
| | | 4 | Facebook wall post | While on Facebook…saw a wall post |
| 5 | Distribution of information | 0 | Distributed within group | Other members of the team |
| | | 1 | Distributed to team leaders | Other members of the team including the coach |
| | | 2 | Distributed to captains | Other members of the team including the team captains |
| | | 3 | Distributed outside group | Students not on the team |
| 6 | Membership | 0 | New | New |
| | | 1 | Senior | Senior |
Question 1: Should [NAME] have shared the information with others?

| Absolutely should share | | OK to share | | Absolutely should not share |
|---|---|---|---|---|
| 0 | 1 | 2 | 3 | 4 |
Martin, K. Information technology and privacy: conceptual muddles or privacy vacuums?. Ethics Inf Technol 14, 267–284 (2012). https://doi.org/10.1007/s10676-012-9300-3