Abstract
Advances in technology have transformed and expanded the ways in which policing is conducted. One new manifestation is the mass acquisition and processing of private facial images via automatic facial recognition (AFR) by the police: what we conceptualise as AFR-based policing. However, there is still a lack of clarity on the manner and extent to which this largely unregulated technology is used by law enforcement agencies, and on its impact on fundamental rights. Public understanding of and involvement in AFR technologies remain insufficient, which in turn undermines social trust in, and the legitimacy and effectiveness of, intelligent governance. This article delineates the function creep of this new mode of policing, identifying the individual and collective harms it engenders. A technological and contextual analysis of the function creep of AFR in policing evidences the comprehensive creep of training datasets and learning algorithms, which has bypassed an unaware public. We argue that the individual harms to dignity, privacy and autonomy combine to constitute a form of cultural harm, impacting directly on individuals and on society as a whole. While recognising the limits of what the law can achieve, we conclude by considering options for redress and the creation of an enhanced regulatory and oversight framework, or Code of Conduct, as a means of encouraging cultural change: from prevailing police indifference towards enforced respect for the human rights potentially engaged. The imperative is to strengthen the top-level design and technical support of AFR policing, imbuing it with the values implicit in the rule of law, democratisation and scientisation, so as to enhance public confidence and trust in AFR social governance and to promote civilised social governance in AFR policing.
Notes
R (Bridges) v CCSWP and SSHD [2019] EWHC 2341 (Admin).
Ibid. [7].
R (Bridges) v CC South Wales & ors [2020] EWCA Civ 1058.
The full text of the draft is available at:
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206.
‘Detecting faces in an image’:
https://docs.aws.amazon.com/rekognition/latest/dg/faces-detect-images.html.
Hellard, Bobby (27 February 2020) Clearview AI client list hacked:
https://www.itpro.com/security/data-breaches/354866/clearview-ai-client-list-hacked.
R (Bridges) (n 1) [2019] [33].
R (Wood) (n 6) [20]-[21]; Re JR (n 6) [86].
R (Bridges) (n 1) [52].
European Convention for the Protection of Human Rights and Fundamental Freedoms, 3 Sept. 1953, ETS 5, 213 UNTS 221.
R (Wood) v Commissioner of Police of the Metropolis [2009] EWCA Civ 414, [2009] 4 All ER 951 [22]; Re JR 38 [2016] AC 1131 [86].
R (Bridges) (n 1) [51].
This doctrine was established in von Hannover v Germany (2004) 40 EHRR 1. See also Campbell v MGN [2004] UKHL 22.
Perry v United Kingdom (2004) 39 EHRR 3 [38]; PG v United Kingdom (2008) 46 EHRR 51 [57]; Lopez Ribalda v Spain (2020) 71 EHRR 7 [89].
R (Bridges) (n 1) [60].
R (Wood) (n 6) [22]; Re JR (n 6) [86].
R (Bridges) (n 1) [51].
R (Wood) (n 6) [36]-[37].
R (Bridges) (n 1) [57].
S v United Kingdom (2009) 48 EHRR 50 [67]. In the case of interception of communications, the Strasbourg Court considered the initial collection, retention and subsequent use of the relevant information as a separate interference with Article 8. Amann v Switzerland (2000) 30 EHRR 843 [GC] [48], [69].
R (Bridges) (n 1) [59].
Ryneš v Úřad [2015] 1 WLR 2607 [22].
R (Bridges) (n 3) [85]-[89].
Identification should state that an AFR camera is in use; that it is processing biometric data; and that the data is being processed by the police, to achieve a particular established purpose.
R (Bridges) (n 1) [39].
Ibid. [40]; R (Bridges) (n 3) [20].
R (T) v Chief Constable of Greater Manchester [2015] AC 49 [88]-[89].
Consider Gaughran v United Kingdom [2020] 2 WLUK 607; The Times 22 April 2020 (ECHR).
Stephanie Kanowitz, Facial recognition inappropriate for high-risk applications, experts say (25 March 2022):
See also similarly the Regulation of Investigatory Powers (Scotland) Act 2000.
United States v Mendenhall 446 US 544 (1980); INS v Delgado 466 US 210 (1984).
Kyllo v United States, 533 US 27 (2001).
Mollett v State, 939 P 2d 1, 11 (Okla Crim App 1997).
United States v Dionisio, 410 US 1, 14 (1973).
See the Explanatory Report accompanying the Protocol of the Council of Europe Convention for the Protection of Individuals with regard to the Automatic Processing of Personal Data [59].
R (Bridges) (n 1) [133]; s 35(8)(b) of the DPA 2018.
Ibid. [37].
R (Bridges) (n 3) [91], [96].
Ibid. [13].
Ibid. [121]-[125]; R (Bridges) (n 1) [30].
Para 1 of Schedule 8 to DPA 2018.
R (Bridges) (n 1) [136], [137].
Ibid. [99].
Ibid. [102].
Ibid. [104].
Ibid. [103].
R (Bridges) (n 3) [140]-[143].
R (Bridges) (n 1) [101].
R (Bridges) (n 3) [60].
Ibid. [16].
R (Bridges) (n 1) [31].
Ibid. [96].
Secretary of State’s Surveillance Camera Code of Practice, ‘Guiding Principles’ 1, 5, 11, and paras 1.8, 2.4.
See ss 35(5) and 42(2) of DPA 2018.
R (Bridges) (n 1) [139].
Ibid. [140], [141].
Ibid. [139]; R (Unison) v Lord Chancellor [2016] ICR 1 [106].
R (Bridges) (n 1) [148].
R (Bridges) (n 3) [152]-[153].
Under s 149(1) of the Equality Act 2010, public authorities must, in the exercise of their functions, have due regard to the need to: eliminate discrimination, harassment, victimisation and any other conduct prohibited by the Act; advance equality of opportunity between persons who share a relevant protected characteristic and persons who do not; and foster good relations between those groups.
R (Bridges) (n 3) [176], [179], [181]. MPS claims that its AFR uses ‘the latest accurate algorithm’. Metropolitan Police ‘Live Facial Recognition’:
https://www.met.police.uk/advice/advice-and-information/facial-recognition/live-facial-recognition/. That in itself will not necessarily fulfil the positive obligation of assuring no bias.
R (Bridges) (n 1) [153].
R (Bridges) (n 3) [198]-[201].
Ibid. [176].
Ibid. [206].
References
Bragias A, Hine K, Fleet R (2021) ‘Only in our best interest, right?’ public perceptions of police use of facial recognition technology. Police Pract Res 22:1637–1654
Ali W et al (2021) Classical and modern face recognition approaches: a complete review. Multimed Tools Appl 80(3):4825–4880
Alikhademi K et al (2022) A review of predictive policing from the perspective of fairness. Artif Intell Law 30:1–17
Amazon (2020) Use cases that involve public safety. https://docs.aws.amazon.com/rekognition/latest/dg/considerations-public-safety-use-cases.html
Ammanath B (2021) Facial recognition: here’s looking at you. A report by the Deloitte AI Institute. https://www2.deloitte.com/content/dam/Deloitte/us/Documents/technology/us-ai-institute-facial-recognition.pdf
Anwarul S, Dahiya S (2020) A comprehensive review on face recognition methods and factors affecting facial recognition accuracy. In: Singh PK et al (eds) Proceedings of ICRIC 2019, Lecture notes, Springer Nature Switzerland AG, pp 495–514
Article 29 Working Party (2014) Opinion 01/2014 on the application of necessity and proportionality concepts and data protection within the law enforcement sector (WP 211)
Article 29 Working Party (2017) Opinion on some key issues of the law enforcement directive (EU 2016/680) (WP 258)
Baker DJ, Robinson PH (2021) Emerging technologies and the criminal law. In: Baker DJ, Robinson PH (eds) Artificial intelligence and the law: cybercrime and criminal liability. Routledge
Barr JR et al (2012) Face recognition from video: a review. Int J Pattern Recogn Artif Intell 26(05):1266002
Barrett LF, Adolphs R, Marsella S (2019) Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements. Psychological Science in the Public Interest 20(1):1–68. https://doi.org/10.1177/1529100619832930
Berle I (2020) Face recognition technology: compulsory visibility and its impact on privacy and the confidentiality of personal identifiable images. Springer Nature, Switzerland AG
Biometrics and surveillance camera commissioner (2021) Fraser Sampson and Brian Plastow's joint letter to members of government regarding the DCMS data consultation. https://www.gov.uk/government/publications/fraser-sampson-and-brian-plastow-letter-on-dcms-data-consultation/fraser-sampson-and-brian-plastows-joint-letter-to-members-of-government-regarding-the-dcms-data-consultation-accessible-version
Bradford B et al (2020) Live facial recognition: trust and legitimacy as predictors of public support for police use of new technology. BJC 60:1502–1522
Canal FZ et al (2022) A survey on facial emotion recognition techniques: a state-of-the-art literature review. Inf Sci 582:593–617
Centre for data ethics and innovation (2020) Independent report: snapshot paper-facial recognition technology. https://www.gov.uk/government/publications/cdei-publishes-briefing-paper-on-facial-recognition-technology/snapshot-paper-facial-recognition-technology
Chan GKY (2021) Towards a calibrated trust-based approach to the use of facial recognition technology. Int J Law Inf Technol 29:305–331
Chesterman S (2011) One nation under surveillance: a new social contract to defend freedom without sacrificing liberty. Oxford University Press, Oxford
Christin A (2017) Algorithms in practice: comparing web journalism and criminal justice. Big Data Soc’y 4:1–14
Citron DK, Pasquale F (2014) The scored society: due process for automated predictions. Wash Law Rev 89:1–33
Cosmi et al (2009) Biometrics: security vs privacy. A scientific and bioethical point of view. In: Mordini E, Green M (eds) Identity, security and democracy: the wider social and ethical implications of automated systems for human identification. IOS Press, pp 57–68
Curry MR (1997) The digital individual and the private realm. Ann Assoc Am Geogr 87:681–699
Degeling M, Berendt B (2018) What is wrong about Robocops as consultants? A technology-centric critique of predictive policing. Ai Soc 33(3):347–356. https://doi.org/10.1007/s00146-017-0730-7
Du H et al (2021) The elements of end-to-end deep face recognition: a survey of recent advances. arXiv:2009.13290
Dworkin G (1988) The theory and practice of autonomy. Cambridge University Press, Cambridge
Edwards L, Urquhart L (2016) Privacy in public spaces: what expectations of privacy do we have in social media intelligence? Int J Law Inf Technol 24:279–310
Eneman M et al (2022) The sensitive nature of facial recognition: tensions between the Swedish police and regulatory authorities. Inf Polity 27(2):219–232
Englich B et al (2006) Playing Dice with Criminal Sentences: The Influence of Irrelevant Anchors on Experts’ Judicial Decision Making. Personal Soc Psychol Bull 32(2):188–200. https://doi.org/10.1177/0146167205282152
European Agency for Fundamental Rights (2019) Facial recognition technology: fundamental rights considerations in the context of law enforcement. https://fra.europa.eu/en/publication/2019/facial-recognition-technology-fundamental-rights-considerations-context-law
FOI (2012) Definition of policing by consent. https://www.gov.uk/government/publications/policing-by-consent/definition-of-policing-by-consent
Fretty DA (2011) Face-recognition surveillance: a moment of truth for fourth amendment rights in public places. Virginia J Law Technol 16:1522–1687
Fussey P et al (2021) ‘Assisted’ facial recognition and the reinvention of suspicion and discretion in digital policing. BJC 61:325–344
Fussey P, Murray D (2019) Independent report on the London Metropolitan Police Service’s trial of facial recognition technology. http://repository.essex.ac.uk/24946/
GAO (2021) Facial recognition technology: federal law enforcement agencies should better assess privacy and other risks (21–518)
Gates KA (2011) Our biometric future: facial recognition technology and the culture of surveillance. New York University Press, New York
Gordon N (2002) On visibility and power: an Arendtian corrective of Foucault. Hum Stud 25:125–145
Grother P, Ngan M, Hanaoka K (2019a) Face recognition vendor test (FRVT) Part 2: Identification (Interagency Report 8271, National Institute of Standards and Technology)
Grother P, Ngan M, Hanaoka K (2019b) Face recognition vendor test (FRVT) Part 3: Demographic Effects (Interagency Report 8280, National Institute of Standards and Technology)
HAI (2020) Evaluating facial recognition technology: a protocol for performance assessment in new domains. https://hai.stanford.edu/sites/default/files/2020-11/HAI_FacialRecognitionWhitePaper_Nov20.pdf
Hamann K, Smith R (2019) Facial recognition technology. Crim Justice 34:9–13
Harel A (2009) Biometrics, identification and practical ethics. In: Mordini E, Green M (eds) Identity, security and democracy: the wider social and ethical implications of automated systems for human identification. IOS Press, pp 69–84
Hartzog W, Stutzman F (2013) Obscurity by Design. Wash L Rev 88:385–418
Hautala L (2019) San Francisco becomes first city to bar police from using facial recognition. https://www.cnet.com/news/san-francisco-becomes-first-city-to-bar-police-from-using-facial-recognition/
Hern A (2020) IBM quits facial-recognition market over police racial-profiling concerns. https://www.theguardian.com/technology/2020/jun/09/ibm-quits-facial-recognition-market-over-law-enforcement-concerns
Hill D, O’Connor CD, Slane A (2022) Police use of facial recognition technology: The potential for engaging the public through co-constructed policy-making. Int J Polic Sci Manag 24(3):325–335
House of Commons Science and Technology Committee (2015) Current and future uses of biometric data and technologies (HC 734). https://publications.parliament.uk/pa/cm201415/cmselect/cmsctech/734/73405.htm
House of Commons Science and Technology Committee (2019) The work of the Biometrics Commissioner and the Forensic Science Regulator. https://publications.parliament.uk/pa/cm201719/cmselect/cmsctech/1970/197004.htm
Hu M (2018) Bulk biometric metadata collection. N C L Rev 96:1425–1474
Information Commissioner’s Office (2019) ICO investigation into how the police use facial recognition technology in public places. https://ico.org.uk/media/about-the-ico/documents/2616185/live-frt-law-enforcement-report-20191031.pdf
Information Commissioner’s Opinion (2019) The use of live facial recognition technology by law enforcement in public places. https://ico.org.uk/media/about-the-ico/documents/2616184/live-frt-law-enforcement-opinion-20191031.pdf
Introna LD, Nissenbaum H (2009) Facial recognition technology: a survey of policy and implementation issues. https://nissenbaum.tech.cornell.edu/papers/facial_recognition_report.pdf
Jagadeesha S (2017) Artificial intelligence-facial recognition. IJRET 6:22–26
Jain AK et al (2021) Biometrics: trust, but verify. arXiv:2105.06625
Jayaraman U et al (2020) Recent development in face recognition. Neurocomputing 408:231–245
Justice Sub-Committee on Policing (2020) Facial recognition: how policing in Scotland makes use of this technology (SP Paper 678 1st Report). https://digitalpublications.parliament.scot/Committees/Report/JSP/2020/2/11/Facial-recognition--how-policing-in-Scotland-makes-use-of-this-technology
Kak A (2020) Regulating biometrics: global approaches and urgent questions. https://ainowinstitute.org/regulatingbiometrics.html
Kaur P et al (2020) Facial-recognition algorithms: a literature review. Med Sci Law 60(2):131–139
Keenan B (2021) Automatic facial recognition and the intensification of police surveillance. MLR 84:886–897
Kindt EJ (2013) Privacy and data protection issues of biometric applications: a comparative legal analysis. Springer, Netherlands
Koops BJ (2021) The concept of function creep. Law Innovation and Technol 13:29–56
Kukielski K (2022) The first amendment and facial recognition technology. Loy LAL Rev 55:231
Lee PX et al (2021) Develop a hybrid human face recognition system based on a dual deep neural network by interactive correction training. In: Nguyen NT et al (eds) Intelligent information and database systems. Springer Nature, Switzerland AG, pp 593–605
Leslie D (2020) Understanding bias in facial recognition technology: an explainer. https://www.turing.ac.uk/sites/default/files/2020-10/understanding_bias_in_facial_recognition_technology.pdf
Lohr S (2018) Facial recognition is accurate, if you’re a white guy. New York Times
Lynch J (2020) Face off: law enforcement use of face recognition technology. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3909038
Lyon D (2003) Surveillance after September 11. Polity Press
Mann M, Smith M (2017) Automated facial recognition technology: recent developments and approaches to oversight. UNSW Law J 40:121–145
Marciano A (2019) Reframing biometric surveillance: From a means of inspection to a form of control. Ethics Inf Technol 21:127–136
Marks P (2021) Can the biases in facial recognition be fixed; also, should they? Commun ACM 64(3):20–22. https://doi.org/10.1145/3446877
Martin AK, Donovan KP (2015) New surveillance technologies and their publics: a case of biometrics Public. Underst Sci 24:842–857
McGlynn C, Rackley E (2017) Image-based sexual abuse. OJLS 37:534–561
McLaughlin M, Castro D (2020) The critics were wrong: NIST data shows the best facial recognition algorithms are neither racist nor sexist. https://www.readkong.com/page/the-critics-were-wrong-nist-data-shows-the-best-facial-8018631
Menegus B (2019) Defense of Amazon’s face recognition tool undermined by its only known police client. https://gizmodo.com/defense-of-amazons-face-recognition-tool-undermined-by-1832238149
Metz R (2019) California lawmakers ban facial-recognition software from police body cams. https://edition.cnn.com/2019/09/12/tech/california-body-cam-facial-recognition-ban/index.html
Monaghan K (2013) Monaghan on equality law, 2nd edn. Oxford University Press
Murphy JR (2018) Chilling: the constitutional implications of body-worn cameras and facial recognition technology at public protests. Wash Lee L Rev 75:1–32
NIST (2020) Facial recognition technology (FRT). https://www.nist.gov/speech-testimony/facial-recognition-technology-frt-0
Note (2007) In the face of danger: facial recognition and the limits of privacy law. HLR 120:1870–1891
Office of the Privacy Commissioner of Canada et al (2021) Joint investigation of Clearview AI, Inc. https://priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2021/pipeda-2021-001
Office of the Privacy Commissioner of Canada (2021) Police use of facial recognition technology in Canada and the way forward. https://priv.gc.ca/en/opc-actions-and-decisions/ar_index/202021/sr_rcmp/
Office of the Privacy Commissioner of Canada (2022) Privacy guidance on facial recognition for police agencies. https://www.priv.gc.ca/en/privacy-topics/surveillance/police-and-public-safety/gd_fr_202205/
Pallitto RM (2013) Bargaining with the machine: a framework for describing encounters with surveillance technologies. Surveill Soc 11:4–17
Partnership on AI (2020) Understanding facial recognition systems. https://partnershiponai.org/wp-content/uploads/2021/08/Understanding-Facial-Recognition-Paper_final.pdf
Patel K et al (2020) Facial sentiment analysis using AI techniques: state-of-the-art, taxonomies, and challenges. IEEE Access 8:90495–90519
Purshouse J, Campbell L (2019) Privacy, crime control and police use of automated facial recognition technology. Crim Law Rev 2019(3):188–204
Purshouse J, Campbell L (2022) Automated facial recognition and policing: a Bridge too far? Leg Stud 42:209–227
Rawls J (1971) A Theory of Justice. Harvard University Press
Raysman R, Brown P (2016) How has facial recognition impacted the law? NYL J
Reidenberg JR (2000) Resolving conflicting international data privacy rules in cyberspace. Stanf Law Rev 52:1315–1371
Rezende IN (2020) Facial recognition in police hands: assessing the ‘Clearview case’ from a European perspective. New J Eur Crim Law 11(3):375–389
Rosenthal JN, Oberly DJ, Noonan AM (2022) From fingerprints to facial recognition: scanning developments in biometric technology. J Robot Artif Intell Law 5:123–128
RoyChowdhury A et al (2020) Improving face recognition by clustering unlabeled faces in the wild. In proceedings of the European conference on computer vision. pp. 119–136
Safdar M et al (2016) Function creep in surveillance techniques. IJSRSET 2:983–988
Seshia SA et al (2022) Toward verified artificial intelligence. Commun ACM 65:46–55
Sharif M et al (2017) Face recognition: a survey. J Eng Sci Technol Rev 10:166–177
Skeem J, Scurich N, Monahan J (2020) Impact of Risk Assessment on Judges’ Fairness in Sentencing Relatively Poor Defendants. Law and human behavior 44(1):51
Slane A (2021) Privacy protective roadblocks and speedbumps restraining law enforcement use of facial recognition software in Canada. Criminal Law Quarterly 69(2):216–236
Smith M, Miller S (2022) The ethical application of biometric facial recognition technology. AI Soc 37(1):167–175
Snyder E (2018) Faceprints and the fourth amendment: how the FBI uses facial recognition technology to conduct unlawful searches. Syracuse L Rev 68:255–276
Springer A et al (2018) Dice in the Black Box: user experiences with an inscrutable algorithm. arXiv:1812.03219
Stevenson M (2018) Assessing risk assessment in action. Minn L Rev 103:303–384
Stevenson MT, Doleac JL (2018) The roadblock to reform. https://www.acslaw.org/wp-content/uploads/2018/11/RoadblockToReformReport.pdf
Stupp C (2020) EU plans rules for facial-recognition technology. Wall Street J. https://www.wsj.com/articles/eu-plans-rules-for-facial-recognition-technology-11582219726
Tagg J (1988) The burden of representation: essays on photographies and histories. Palgrave Macmillan
The Law Society of England and Wales (2019) Algorithms in the criminal justice system. https://www.lawsociety.org.uk/topics/research/algorithm-use-in-the-criminal-justice-system-report
Tolba AS, El-Baz AH, El-Harby AA (2006) Face recognition: a literature review. Int J Inf Commun Eng 2:88–103
Trigueros DS, Meng L, Hartnett M (2018) Face recognition: from traditional to deep learning methods. arXiv:1811.00116
Turner J (2019) Robot rules: regulating artificial intelligence. Palgrave Macmillan, London
Ubelaker DH, Shamlou A, Kunkle A (2019) Contributions of forensic anthropology to positive scientific identification: a critical review. Forensic Sci Res 4:45–50
Vapnik V (2013) The Nature of Statistical Learning Theory. Springer
Verbeek PP (2011) Moralizing technology: understanding and designing the morality of things. University of Chicago Press, Chicago
Waldron J (2012) The harm of hate speech. Harvard University Press
Wang M, Deng W (2020) Mitigating bias in face recognition using skewness-aware reinforcement learning. In proceedings of the IEEE/CVF conference on computer vision and pattern recognition. pp. 9322–9331
Wang M, Deng W (2021) Deep face recognition: a survey. Neurocomputing 429:215–244
Wang M et al (2019) Racial Faces in the wild: reducing racial bias by information maximization adaptation network. In proceedings of the IEEE international conference on computer vision. pp.692–702
Williford JR et al (2020) Explainable face recognition. In proceedings of the European conference on computer vision. pp. 248–263
Winner L (1977) Autonomous technology. MIT Press
Yin B et al (2019) Towards interpretable face recognition. In proceedings of the IEEE international conference on computer vision. pp. 9348–9357
Zarsky TZ (2014) Understanding discrimination in the scored society. Wash L Rev 89:1375–1412
Zee T et al (2019) Enhancing human face recognition with an interpretable neural network. In proceedings of the IEEE international conference on computer vision workshops. pp. 514–522
Acknowledgements
The authors would like to thank Dr Jiahong Chen, Lecturer in the School of Law, University of Sheffield, Professor Haitao Yu, University of Tsukuba, and Dr Fajie Yuan, Westlake University, for their comments. Conference invitations from Professor Kun Liang, Professor Qinghua Wang, Professor Dengke Xie, Associate Professor Li Li and Dr Yana Li are acknowledged.
Funding
Funding was provided by the Program for Young Innovative Research Team in China University of Political Science and Law and by the National Social Science Foundation project ‘Research on the construction of criminal security risk prevention system in digital economy’ (21&ZD209). There are no financial or non-financial interests that are directly or indirectly related to the work submitted for publication.
Author information
Authors and Affiliations
Corresponding author
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Guo, Z., Kennedy, L. Policing based on automatic facial recognition. Artif Intell Law 31, 397–443 (2023). https://doi.org/10.1007/s10506-022-09330-x