Published on 9 December 2020

Should the police be using facial recognition systems?

Some police forces in England and Wales - as in other countries - have started to pilot automated facial recognition systems. The technology has been controversial, however. Isabella Duan, in a new working paper, argues that greater safeguards against abuse are needed if the police are to avoid undermining trust in a technology which could in future help policing.

The use of live automated facial recognition (AFR) systems for law enforcement in England and Wales has drawn criticism for the inadequacy of the technology's governance. In August 2020, the Court of Appeal, reviewing the case Bridges v. the Chief Constable of the South Wales Police and Others, ruled that the force's deployment of AFR lacked an adequate legal framework. In any case, while the law defines the 'lawful', it may not define the 'ideal', or the best state of affairs for society. An important question is: what would be the best policy framework for the use of AFR systems in policing?

I argue that the ‘ideal’ for AFR systems is that they are both trusted and trustworthy – that is, deserving of trust. I also distinguish between public trust and expert stakeholder trust. On the one hand, public trust in the use of the technology will depend on whether the system in which it is embedded is trusted more broadly: does the public trust the police in general? On the other hand, stakeholders place trust deliberatively and look for evidence justifying the use of the technology in terms of socially desired outcomes, thereby determining whether public trust is well-placed. A good governance framework, therefore, must ensure both public trust and stakeholder trust.

In the case of the police’s deployment of AFR systems in England and Wales, the country’s cultural history of policing by consent may mean that public trust is readily extended to innovative ways of policing, whereas stakeholders are weighing whether the technology is justified within the UK’s existing data protection governance. That framework defines ‘necessity and proportionality’ as the test of justification, which leaves considerable space for divergent interpretations of the phrase among stakeholders.

Drawing on interviews with 23 stakeholders from government, industry, academia, civil society, and the police, as well as archive data, I analyze and contrast the ways in which different stakeholders and the public decide to place trust (or not) in the police’s use of AFR systems. There are different ways in which AFR could be evaluated, but I find that the accumulation of evidence has not contributed to a shared understanding of AFR systems.

Indeed, there are at times distinct and often conflicting assumptions, notably between the police and other stakeholders, about how to interpret evidence from AFR deployments and about who can interpret the results authoritatively. For example, the police tend to evaluate accuracy by the proportion of false-positive alerts in an estimated total number of recognition opportunities, whereas many researchers focus on the proportion of false-positive alerts in the total number of alerts generated by the AFR systems. Consequently, for the same trial, the police may claim that the system is 99% accurate, while researchers may assert that more than 70% of matches are incorrect. The absence of agreed standards for evaluation thus introduces conflicting evidence and misunderstandings.
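To make the contrast concrete, here is a minimal sketch with purely hypothetical numbers (faces_scanned, alerts and false_alerts are illustrative assumptions, not figures from any actual deployment). It shows how the choice of denominator, rather than the underlying data, drives the gap between the two headline figures.

```python
# Illustrative sketch with hypothetical trial numbers: the same deployment
# can look very different under the two metrics described above.

faces_scanned = 10_000   # estimated recognition opportunities during the trial (assumed)
alerts = 100             # matches generated by the AFR system (assumed)
false_alerts = 75        # alerts later confirmed to be incorrect (assumed)

# Police-style metric: false alerts as a share of all recognition opportunities
police_accuracy = 1 - false_alerts / faces_scanned
print(f"Police framing: {police_accuracy:.0%} of scans produced no false alert")
# roughly 99% 'accurate'

# Researcher-style metric: false alerts as a share of the alerts generated
false_match_rate = false_alerts / alerts
print(f"Researcher framing: {false_match_rate:.0%} of matches were incorrect")
# 75% of matches incorrect
```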

The lack of stakeholder trust warns against licensing the police’s use of AFR systems, despite high public trust. Public trust may be misplaced, since the evidence of AFR’s reliability is not clear. Furthermore, the current lack of stakeholder trust is likely eventually to contribute to a negative public perception of AFR systems and erode public trust. The governance framework for AFR urgently needs to coordinate the grounded experience of the police, the expertise of social scientists, and the deliberative perspectives of the public. While the UK government is reviewing the governance framework and considering its simplification and extension, it needs to act quickly to keep up with the diffusion of the technology.

Working paper: “Governing Live Automated Facial Recognition Systems for Policing in England and Wales” 


The views and opinions expressed in this post are those of the author(s) and not necessarily those of the Bennett Institute for Public Policy.

Authors

Isabella Duan

Isabella Duan is a Laidlaw scholar and a BSc Philosophy, Politics, Economics student at University College London. Her research, supervised by Professor Diane Coyle at the Institute, investigates the ideal...
