Abstract
The development and application of artificial intelligence (AI) technology have raised widespread public concerns about privacy violations. In response, privacy-preserving computation technologies (PPCTs) have been developed, with the expectation that these new techniques can solve current privacy problems. Because they do not directly use the raw data provided by users, PPCTs claim to protect privacy better than their predecessors. Nevertheless, they still have technical limitations, and considerable research has treated PPCTs merely as privacy-protecting tools, focusing on possible technical improvements. In this article, we argue that PPCTs cannot effectively protect privacy in the narrow sense because of these technical limitations. Moreover, even if these shortcomings were remedied, PPCTs would still fall into a paradox: their aim is to protect privacy, yet they can reveal users' private information. This paradox of privacy protection not only aggravates the social impact of AI privacy issues but may also deprive current privacy protection of its meaning, leaving it ineffective in practice.
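The following code does not appear in the article; it is a minimal, generic sketch of one PPCT family the abstract alludes to, differential privacy via the Laplace mechanism (all function names here are illustrative, not from the article). It shows the basic idea of "not directly using raw data": an analyst receives only a noised query answer, never individual records, yet the noise scale is the kind of technical parameter whose misconfiguration motivates the limitations the authors discuss.

```python
import math
import random


def laplace_noise(scale):
    """Sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def private_count(records, predicate, epsilon=1.0):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true count by at most 1), so Laplace noise with scale
    1/epsilon suffices. Smaller epsilon means stronger privacy but a
    noisier, less useful answer.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


if __name__ == "__main__":
    ages = [23, 35, 41, 58, 62, 29, 47, 51]
    # The analyst sees only the noised count, not the raw ages.
    print(private_count(ages, lambda a: a >= 40, epsilon=0.5))
```

The trade-off made explicit by `epsilon` is central to the article's argument: the same parameter that limits leakage also limits utility, and no setting of it eliminates inference risks entirely.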
Funding
This work was supported by National Office for Philosophy and Social Sciences under Grant 19CZX043.
Contributions
Bin Ye contributed to the ideation and conceptualisation of the manuscript. Xiao-yu Sun and Bin Ye contributed to the writing, editing and revision of the drafts of the manuscript.
Ethics declarations
Conflict of interest
The authors report that there are no competing interests to declare.
About this article
Cite this article
Sun, Xy., Ye, B. Privacy preserving or trapping?. AI & Soc (2022). https://doi.org/10.1007/s00146-022-01610-z