Facial Recognition: Police Use Debate

Department: Home Office


Bell Ribeiro-Addy Excerpts
Wednesday 13th November 2024


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Bell Ribeiro-Addy (Clapham and Brixton Hill) (Lab)

Thank you, Dame Siobhain, for your merciful chairpersonship. I thank the right hon. Member for Maldon (Sir John Whittingdale) for introducing this crucial debate.

Like many others, I have many concerns about live facial recognition technology, some of which have already been raised, but I will focus my remarks on the room for error and the potential impact that this technology will have on already dwindling public trust in police, particularly among black, Asian and ethnic minority citizens. I will raise points similar to those of my hon. Friends the Members for Liverpool Riverside (Kim Johnson) and Brent East (Dawn Butler).

Live facial recognition technology compares live CCTV images with those already on the police database and other images taken from open source, publicly available image sites. This is a deeply flawed plan that could result in serious mix-ups. A simple mislabelling on an image database could lead to the wrong person being stopped and a potentially traumatic experience with the police.

I can illustrate my point with a short anecdote; this happened to me a mere few months after I was elected to this House. My hon. Friend the Member for Battersea (Marsha De Cordova) was speaking in the Chamber. BBC Parliament miscaptioned her as my hon. Friend the Member for Brent East and, when they spotted this, both Members took to Twitter to point out the mistake. In its haste to cover the story, the Evening Standard incorrectly used a picture of me instead of my hon. Friend the Member for Battersea—I hope everybody is following this—and in its apology to all three of us, it suggested that Getty Images, from which it had taken the picture, had labelled most of the pictures of me since I had been elected with the name of my hon. Friend the Member for Battersea. Since then, to avoid embarrassment, it seems that most publications now use pictures of me looking like a constipated walrus; they have said that their reason for this is that they can be sure it is me and they want to avoid any further embarrassment.

Although problematic, that is a comparatively trivial example of what can happen when images are mislabelled; but if humans can make these errors, the technologies they create obviously can too. If online sources are going to be used as part of the image database, it is almost inevitable that images will be mislabelled and that innocent people will be subject to needless run-ins with the police.

Questions around the numerical similarity score used to determine matches also ought to be raised. We already know that facial recognition data has racial bias: it is deeply flawed when attempting to identify people with darker skin tones, just as Getty Images is, and the Metropolitan police’s own testing of its facial recognition algorithm identified disproportionately higher inaccuracy rates when attempting to identify people of colour and women.

People of colour are already disproportionately stopped and searched at higher rates, and the use of potentially flawed technology will serve only to increase the rate at which ethnic minorities are stopped, searched and possibly even incorrectly detained, further dampening trust in the police among these communities. We know that that needs to be resolved. To any Member who thinks that I am exaggerating the potential for misidentification, I say this: in 2023, Big Brother Watch found that over 89% of UK police facial recognition alerts wrongly identified members of the public as people of interest. In that case, what benefits does this technology bring? It has been used in the borough of Lambeth, including in my own constituency, on a number of occasions, but as far as I am aware it has not produced a substantial number of results. Our constituents are effectively being placed under constant surveillance. The notion of their presumed innocence, which sits at the heart of our justice system, has been undermined, and this “cutting-edge” technology has not produced substantial results.

With some 6 million CCTV cameras in the UK, all of which have the potential to be converted into facial recognition cameras, we are veering dangerously close to becoming a police state with levels of surveillance that would be deemed acceptable only in the most authoritarian of dictatorships. I believe that our liberty and our security can co-exist. It is not a matter of “those who have nothing to hide have nothing to fear”; it is a matter of the basic principles of freedom and privacy. Those basic principles call into question what such surveillance is really here for. Is it here to keep us safe or to monitor us 24/7?

Most Members would undoubtedly, I hope, protest at the idea of police randomly stopping members of the public to check their fingerprints or DNA against databases just for a possible match. Why should we look at this intrusive automated biometric software any differently?

--- Later in debate ---
Bobby Dean

I thank the hon. Gentleman for his intervention, but the evidence is quite clear in this area. Somebody might watch this debate and have doubts, but the research is quite clear.

Bell Ribeiro-Addy

Further to the point made by the hon. Member for South Basildon and East Thurrock (James McMurdock), just about every time that somebody has stated that there are issues of racial discrimination with this technology, they have cited sources that people can look at. For the benefit of both the public and the hon. Member, it is important to note that these are not just assumptions; they are based on data and evidence. There is further evidence we could give, such as my personal experience and the experiences of others, but those specific points were made with evidence.

Bobby Dean

I agree with the hon. Member that some evidence has been cited in the Chamber today, but there is other evidence that we can look at. Let us not forget that the technology exacerbates the known problem—particularly with the Met police in London, where I live—that black communities feel over-policed and underserved. That has built up over time, and the use of this technology could exacerbate that problem further.

The hon. Member for Leicester South (Shockat Adam) made a comment about how polite conversations do not always register as polite conversations. That is because of the persistence of those conversations over time: a repeated polite conversation starts to feel like an aggressive one to the person on the receiving end. There was also discussion about the findings of the National Physical Laboratory, but it is clear that those findings are disputed—[Interruption.] Well, it is clear that they are disputed; they have been disputed in the Chamber today. Until we get to the bottom of that, we need to think carefully about the controls that we have in relation to discrimination.

I want to talk about the general principle of privacy. As a liberal, I feel a general depression about how we have come to devalue privacy in society, and how we trade it away far too readily for other societal aims. We often hear the claim, “If you’re not doing anything wrong then there’s nothing to worry about,” as if the only value of privacy were to hide things that someone might be doing wrong. That is not the case. Privacy delivers so much more than that. It delivers personal wellbeing and gives people control over their own data. It allows us to have freedom of association and dignity. We need to think very carefully before we so readily trade away the principle of privacy in pursuit of other goals in society.

The opportunity for slippage has been discussed at length. One would think that such technology would come with strict controls, but it is clear that at the moment we have the opposite; in fact, Big Brother Watch has described it as a “legal vacuum”. The hon. Member for Brighton Pavilion (Siân Berry) talked about the creeping expansion of its use in London. I have seen that myself; what started off being limited to large-scale events, such as football matches, has turned into routine trials on high streets, such as mine in Sutton.

We have also seen expansion in the photos that are used. The technology started off using only photographs of people known to the police, for good reason, but it has been expanded to potentially include everyone who has a passport or driving licence photo. What started as strictly about warrant breakers and sex offenders could expand to pretty much anything the Government of the day want. If we think about the clampdown on protest under the previous Government, that potentially has a chilling impact on the right to freedom of association.

With all of those doubts, it is clear that we need proper parliamentary consideration of the issue. The Lib Dems ask the Minister to immediately halt the roll-out of live facial recognition technology until we get it right. It should be down to this place to determine the correct controls and whether there is a legitimate use of the technology at all, given all the concerns about discrimination and privacy. Privacy is a fundamental civil liberty. We have undervalued it far too much in recent times. This is an opportunity to protect it, and we should take it.