Policing based on automatic facial recognition


Abstract

Advances in technology have transformed and expanded the ways in which policing is conducted. One new manifestation is the mass acquisition and processing of private facial images by the police via automatic facial recognition: what we conceptualise as AFR-based policing. However, there is still a lack of clarity about the manner and extent to which this largely unregulated technology is used by law enforcement agencies, and about its impact on fundamental rights. Public understanding of, and involvement in, AFR technologies remain insufficient, which in turn undermines social trust in, and the legitimacy and effectiveness of, intelligent governance. This article delineates the function creep of AFR-based policing, identifying the individual and collective harms it engenders. A technological, contextual perspective on the function creep of AFR in policing evidences the comprehensive creep of training datasets and learning algorithms, which has bypassed an unknowing public. We thus argue that individual harms to dignity, privacy and autonomy combine to constitute a form of cultural harm, impacting directly on individuals and on society as a whole. While recognising the limitations of what the law can achieve, we conclude by considering options for redress and the creation of an enhanced regulatory and oversight framework model, or Code of Conduct, as a means of encouraging cultural change from prevailing police indifference towards enforced respect for the human rights potentially engaged. The imperative will be to strengthen the top-level design and technical support of AFR policing, imbuing it with the values implicit in the rule of law, democratisation and scientisation, so as to enhance public confidence and trust in AFR social governance and to promote civilised social governance in AFR policing.


Notes

  1. R (Bridges) v CCSWP and SSHD [2019] EWHC 2341 (Admin).

  2. Ibid. [7].

  3. R (Bridges) v CC South Wales & ors [2020] EWCA Civ 1058.

  4. The full text of the draft is available at:

    https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206.

  5. ‘Detecting faces in an image’:

    https://docs.aws.amazon.com/rekognition/latest/dg/faces-detect-images.html.

  6. Hellard, Bobby (27 February 2020) Clearview AI client list hacked:

    https://www.itpro.com/security/data-breaches/354866/clearview-ai-client-list-hacked.

  7. R (Bridges) (n 1) [33].

  8. R (Wood) (n 11) [20]-[21]; Re JR (n 11) [86].

  9. R (Bridges) (n 1) [52].

  10. European Convention for the Protection of Human Rights and Fundamental Freedoms, 3 Sept. 1953, ETS 5, 213 UNTS 221.

  11. R(Wood) v Commissioner of Police of the Metropolis [2009] EWCA Civ 414, [2009] 4 All ER 951 [22]; Re JR 38 [2016] AC 1131 [86].

  12. R (Bridges) (n 1) [51].

  13. This doctrine was established in Von Hannover v Germany (2004) 40 EHRR 1. See also Campbell v MGN [2004] UKHL 22.

  14. Perry v United Kingdom (2004) 39 EHRR 3 [38]; PG v United Kingdom (2008) 46 EHRR 51 [57]; Lopez Ribalda v Spain (2020) 71 EHRR 7 [89].

  15. R (Bridges) (n 1) [60].

  16. R (Wood) (n 11) [22]; Re JR (n 11) [86].

  17. R (Bridges) (n 1) [51].

  18. R (Wood) (n 11) [36]-[37].

  19. R (Bridges) (n 1) [57].

  20. S v United Kingdom (2009) 48 EHRR 50 [67]. In the case of interception of communications, the Strasbourg Court treated the initial collection, the retention and the subsequent use of the relevant information each as a separate interference with Article 8. Amann v Switzerland (2000) 30 EHRR 843 [GC] [48], [69].

  21. R (Bridges) (n 1) [59].

  22. Ryneš v Úřad [2015] 1 WLR 2607 [22].

  23. R (Bridges) (n 3) [85]-[89].

  24. Identification should state that an AFR camera is in use; that it is processing biometric data; and that the data is being processed by the police to achieve a particular, established purpose.

  25. R (Bridges) (n 1) [39].

  26. Ibid. [40]; R (Bridges) (n 3) [20].

  27. R (T) v Chief Constable of Greater Manchester [2015] AC 49 [88]-[89].

  28. Consider Gaughran v United Kingdom [2020] 2 WLUK 607; The Times 22 April 2020 (ECHR).

  29. Stephanie Kanowitz, Facial recognition inappropriate for high-risk applications, experts say (25 March 2022):

    https://gcn.com/emerging-tech/2022/03/facial-recognition-inappropriate-high-risk-applications-experts-say/363603/.

  30. See also similarly the Regulation of Investigatory Powers (Scotland) Act 2000.

  31. United States v Mendenhall 446 US 544 (1980); INS v Delgado 466 US 210 (1984).

  32. Kyllo v United States, 533 US 27 (2001).

  33. Mollett v State, 939 P 2d 1, 11 (Okla Crim App 1997).

  34. United States v Dionisio, 410 US 1, 14 (1973).

  35. See the Explanatory Report accompanying the Protocol of the Council of Europe Convention for the Protection of Individuals with regard to the Automatic Processing of Personal Data [59].

  36. When retail stores use image processing technologies that compile basic demographics (gender, race, and skin tones), they are processing sensitive data (Slane 2021). This kind of facial analysis ‘can give rise to its own problems of bias and inaccuracies’ (Chan 2021).

  37. R (Bridges) (n 1) [133]; s 35(8)(b) of the DPA 2018.

  38. Ibid. [37].

  39. R (Bridges) (n 3) [91], [96].

  40. Ibid. [13].

  41. Ibid. [121]-[125]; R (Bridges) (n 1) [30].

  42. Para 1 of Schedule 8 to DPA 2018.

  43. R (Bridges) (n 1) [136], [137].

  44. Ibid. [99].

  45. Ibid. [102].

  46. Ibid. [104].

  47. Ibid. [103].

  48. R (Bridges) (n 3) [140]-[143].

  49. R (Bridges) (n 1) [101].

  50. R (Bridges) (n 3) [60].

  51. Ibid. [16].

  52. R (Bridges) (n 1) [31].

  53. Ibid. [96].

  54. Secretary of State’s Surveillance Camera Code of Practice, ‘Guiding Principles’ 1, 5, 11, and paras 1.8, 2.4.

  55. See ss 35(5) and 42(2) of DPA 2018.

  56. R (Bridges) (n 1) [139].

  57. Ibid. [140], [141].

  58. Ibid. [139]; R (Unison) v Lord Chancellor [2016] ICR 1 [106].

  59. R (Bridges) (n 1) [148].

  60. R (Bridges) (n 3) [152]-[153].

  61. In the exercise of their functions, public authorities must have due regard to three matters: the need to eliminate discrimination, harassment, victimisation and any other conduct prohibited by the Equality Act; to advance equality of opportunity between persons who share a relevant protected characteristic and persons who do not; and to foster good relations between such persons. See s 149(1).

  62. For more on fair justice, see Rawls (1971). For more on the absolutely just services of the British police, see FOI (2012).

  63. R (Bridges) (n 3) [176], [179], [181]. MPS claims that its AFR uses ‘the latest accurate algorithm’. Metropolitan Police ‘Live Facial Recognition’:

    https://www.met.police.uk/advice/advice-and-information/facial-recognition/live-facial-recognition/. That in itself will not necessarily discharge the positive obligation to ensure the absence of bias.

  64. R (Bridges) (n 1) [153].

  65. R (Bridges) (n 3) [198]-[201].

  66. Ibid. [176].

  67. Ibid. [206].


Acknowledgements

The authors would like to thank Dr Jiahong Chen, Lecturer in the School of Law, University of Sheffield; Professor Haitao Yu, University of Tsukuba; and Dr Fajie Yuan, Westlake University, for their comments. Conference invitations from Professor Kun Liang, Professor Qinghua Wang, Professor Dengke Xie, Associate Professor Li Li and Dr Yana Li are gratefully acknowledged.

Funding

Funding was provided by the Program for Young Innovative Research Teams at the China University of Political Science and Law, and by the National Social Science Foundation project ‘Research on the construction of a criminal security risk prevention system in the digital economy’ (21&ZD209). There are no financial or non-financial interests that are directly or indirectly related to the work submitted for publication.

Author information


Corresponding author

Correspondence to Zhilong Guo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Guo, Z., Kennedy, L. Policing based on automatic facial recognition. Artif Intell Law 31, 397–443 (2023). https://doi.org/10.1007/s10506-022-09330-x

