Facial Recognition: Police Use Debate

Department: Home Office

Facial Recognition: Police Use

Chris Philp Excerpts
Wednesday 13th November 2024


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Sir John Whittingdale

I agree with my right hon. Friend. The problem at the moment is that we do not even have national guidelines. There is a complete absence, which I will come to later. I will give way to the shadow Home Secretary.

Chris Philp (Croydon South) (Con)

I am extremely grateful to my right hon. Friend for giving way. I would like to add some context to the question of racial bias. There were allegations of racial bias a few years ago. The system was tested by the National Physical Laboratory about two years ago and, at the settings used by the police, no racial bias was found. That was one of the conditions set in the Bridges litigation about four years ago, and I hope that gives my right hon. Friend and other hon. Members some reassurance on the question of racial bias. It has been tested by the National Physical Laboratory.

Sir John Whittingdale

As I understand it, the number of false positives recorded depends to some extent on the threshold at which the technology is set.

--- Later in debate ---
Dawn Butler (Brent East) (Lab)

It is a pleasure to serve under your chairmanship, Dame Siobhain. I thank the right hon. Member for Maldon (Sir John Whittingdale) —or I could say my right hon. Friend, if he does not mind—for securing this debate. I have spoken to the Secretary of State and Ministers in the Department for Science, Innovation and Technology, and there is an awareness that we need a lot of careful and considerate thinking on this issue. Obviously, a new Government have just come in and this is not a new issue, as the right hon. Member for Maldon said—LFR was first used in 2017, so there is a lot of clearing up that has to be done.

Live facial recognition changes one of the cornerstones of our democracy: an individual is innocent until proven guilty. With this technology, if the machine says an individual is guilty because they have been identified using live facial recognition, they then have to prove their innocence. That is a huge change in our democracy that nobody has consented to. We have not consented to it in this place, and as we police by consent as a society, that should really worry us all.

Chris Philp

I thank the hon. Lady for giving way; I am looking forward to this debate and to concluding it for the Opposition later.

On the question of changing the burden of proof or undermining the concept that someone is innocent until proven guilty, the technology absolutely does not change that. What it does is give the police a reason to stop somebody and check their identity to see whether they are the person wanted for a criminal offence. It certainly does not provide evidence on which a conviction might be secured. In fact, it is no different from the police stopping someone because they are suspicious of them, and it is a lot more accurate than stop and search, about which I am sure the hon. Lady has views. It is simply a tool to enable the police to stop somebody and check their identity to see whether they are the person who is wanted. It certainly does not undermine the very important principle that a person is innocent until proven guilty.

Dawn Butler

The shadow Minister has hit on an important point regarding reasonable suspicion. What is reasonable suspicion? How have the police got to that point? If he is then going to make reference to watchlists, who is put on a watchlist? We know, for instance, that the Met police has hundreds of thousands of people on its system who should not be there. We know that the watchlist can consist of people it considers to be vulnerable, such as those with mental health issues. Anybody in this room could be put on a watchlist, so I am afraid the shadow Minister has not quite nailed the point he was trying to make.

--- Later in debate ---
Chris Philp (Croydon South) (Con)

It is a pleasure, as always, to serve under your chairmanship, Dame Siobhain. I congratulate my right hon. Friend the Member for Maldon (Sir John Whittingdale) on securing the debate and on the characteristically thoughtful manner in which he approached his speech.

I think this is the first time that I have appeared opposite the new Minister for Policing, Fire and Crime Prevention—the job that I was doing until a few months ago—so let me congratulate her on her appointment. Although I will of course hold the Government to account, I will do everything I can to constructively support her in making a great success of the job, and I really do wish her well in the role.

I want to start by reminding colleagues of the way that live facial recognition works. It is different from retrospective facial recognition, which we have not debated today and, in the interests of time, I do not propose to go into. As some Members have already said, live facial recognition starts with a watchlist of people who are wanted by the police. It is not the case that anyone can get on that watchlist, which generally comprises people who are wanted for criminal offences—often very serious offences—people who have failed to attend court, and people who are registered sex offenders, where the police want to check that they are complying with their conditions. As people walk down a high street, they are scanned, typically by a CCTV camera on a mobile van, and then compared to the watchlist. The vast majority of people are not on the watchlist, as we would expect, and their image is immediately and automatically deleted. Where a person is on the watchlist, the police will stop them and ask if they have any form of identification.

To be very clear, no one gets convicted on the basis of that facial recognition match, so it is not overturning the presumption of innocence, and if it turns out that the person stopped is not the person on the watchlist, obviously they can continue on their way. However, if they are the person on the watchlist, a normal criminal investigation will follow, with the normal standards of evidence.
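[Editor's note: to make the workflow described above easier to follow, here is a minimal illustrative sketch in Python. The function names, data structures and similarity measure are hypothetical and invented for the example; the only details taken from the debate are the shape of the process (scan, compare against a watchlist, delete non-matches, alert on matches) and the 0.6 setting referred to later in the debate.]

```python
from dataclasses import dataclass

MATCH_THRESHOLD = 0.6  # the "0.6 setting" referred to in the debate


@dataclass
class WatchlistEntry:
    name: str
    reason: str             # e.g. wanted for an offence, failed to attend court
    embedding: list[float]  # face template held for the watchlist entry


def similarity(a: list[float], b: list[float]) -> float:
    """Hypothetical face-similarity score; cosine similarity used purely as a stand-in."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(x * x for x in b) ** 0.5)
    return dot / norm if norm else 0.0


def process_face(face_embedding: list[float], watchlist: list[WatchlistEntry]):
    """Compare one scanned face against the watchlist.

    Returning None corresponds to the passer-by whose image is immediately and
    automatically deleted. Returning an entry only prompts officers to stop the
    person and check identity; it is not evidence on which a conviction rests.
    """
    best = max(watchlist, key=lambda e: similarity(face_embedding, e.embedding), default=None)
    if best is not None and similarity(face_embedding, best.embedding) >= MATCH_THRESHOLD:
        return best
    return None  # no alert: image deleted, person walks on
```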

Iqbal Mohamed

On the point about the automatic deletion of data, there are many examples, but the one I can remember is Google incognito browsing mode. That was meant to be very private—only you saw where you went—but Google was found to be storing that data, and it has been legally challenged and prosecuted for breaching the GDPR or other privacy laws. Companies may say that things are immediately deleted, but it is not always true.

Chris Philp

That is a good point; we must ensure that the operating procedures are adhered to, and I will come on to that a little later. However, to be absolutely clear, if someone is identified as a match, a normal criminal investigation is conducted to normal criminal standards. Nobody is convicted on the basis of this evidence alone—or, indeed, on the basis of this evidence at all.

Let me come to the question about racial disparity. When this technology was first introduced, about seven years ago, there were reports—accurate reports—that there was racial bias in the way that the algorithm operated. The algorithm has been developed a great deal since those days, and it has been tested definitively by the National Physical Laboratory, the nation’s premier testing laboratory. NPL testing is the gold standard of testing and this technology has been tested relatively recently. For the benefit of Members, I will read out what the results of that testing were:

“The NPL study found that, when used at the settings maintained by the Met”—

that is the 0.6 setting that the hon. Member for Brent East (Dawn Butler) referred to earlier—

“there was no statistically significant difference in the facial recognition technology’s accuracy across”

different demographic groups. In other words, the technology as it is being used today—not five years ago, when there were issues—has been certified by the NPL and it has been found that there is not any racial bias at the settings used.
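[Editor's note: on the point made earlier that the number of false positives recorded depends on the threshold at which the technology is set, the short illustrative sketch below shows how moving a threshold such as the 0.6 setting changes how many alerts are raised. The scores are invented for the example and are not real system outputs.]

```python
def alerts(scores: list[float], threshold: float) -> int:
    """Count how many scanned faces would trigger an alert at a given threshold."""
    return sum(score >= threshold for score in scores)


# Hypothetical similarity scores for passers-by who are NOT on the watchlist.
non_match_scores = [0.31, 0.42, 0.55, 0.58, 0.61, 0.48, 0.39, 0.63, 0.57, 0.44]

for threshold in (0.55, 0.60, 0.65):
    print(f"threshold {threshold:.2f}: {alerts(non_match_scores, threshold)} false alerts")
# Raising the threshold cuts false positives, at the cost of potentially
# missing genuine matches that score just below the line.
```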

Dawn Butler

But when we look at the numbers of people, something like 0.5% of scans—I cannot remember the statistic—still result in somebody being misidentified.

Chris Philp

On the misidentification rate, I think the Bridges court case set a standard of a false positive rate of one in 1,000: out of every 1,000 people stopped, 999 are the people the police think they are, while one is misidentified. The Minister may have more up-to-date figures, but from my recollection the system in practice is running at about one in 6,000. That is an extraordinarily high accuracy rate—much more accurate than a regular stop and search.

About 25% to 30% of regular physical stops and searches, where a police officer stops someone and searches them for drugs or a knife or something, are successful. About 70% are unsuccessful, while the equivalent figure for live facial recognition is 0.02%. That means that this technology is 4,500 times less likely to result in someone being inappropriately stopped than a regular stop and search. It therefore hugely—by three orders of magnitude—reduces the likelihood of someone being improperly stopped and searched.
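[Editor's note: a rough, purely illustrative check of the orders-of-magnitude comparison above, taking the quoted percentages at face value; the exact multiple depends on which stop-and-search figure is assumed.]

```python
import math

stop_search_miss_rate = 0.70   # ~70% of physical stop and searches find nothing
lfr_false_stop_rate = 0.0002   # 0.02% quoted for live facial recognition

ratio = stop_search_miss_rate / lfr_false_stop_rate
print(f"ratio: roughly {ratio:,.0f} to 1")                    # in the low thousands
print(f"orders of magnitude: about {math.log10(ratio):.1f}")  # i.e. about three
```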

I turn to the use of the technology on the ground. I asked for it to be trialled in the centre of Croydon, which is the borough I represent in Parliament. Over the past nine months or so, it has been deployed on a relatively regular basis: about once a week. I believe that the Minister was supposed to go down this morning to have a look; I certainly encourage her to go again as soon as she can. By the way, the hon. Member for Birmingham Perry Barr (Ayoub Khan) asked whether people know when the technology is being used. The answer is yes: one of the guidelines is that public signage must be displayed telling the public that the technology is in use.

Over that period in Croydon, there have been approximately 200 arrests of people who would not otherwise have been arrested, including for all kinds of offences such as class A drugs supply, grievous bodily harm, fraud and domestic burglary. It has also included a man who had been wanted for two rapes dating back to 2017. That wanted rapist would be free to this day if not for this technology. Just a couple of weeks ago, a man was stopped and subsequently arrested in relation to a rape allegation from June this year. There are people who are alleged to have committed rape who would not have been stopped—who would still be walking free—if not for this technology. It is only the fact that they walked past a camera outside East Croydon station or somewhere that has meant they were stopped by the police. They will now have a normal trial with the normal standards of evidence, but they would not have been caught in the first place if not for this technology.

I have done quite a lot of public meetings on this. I explain, “These are the people who get caught, and the price the public pay is that you might get scanned when you walk down Croydon High Street, but if you are innocent your picture is immediately deleted.” By and large, the overwhelming majority of people in Croydon think that is a reasonable trade-off.

There should be protections, of course. Several hon. Members, including my right hon. Friend the Member for Maldon, have rightly said that there should be guidelines, rules and procedures. However, it is not true that there is a complete vacuum as far as rules and regulations are concerned. The Bridges case at the Court of Appeal in 2020 looked at how South Wales police were using the technology between 2017 and 2020. It found that some of the ways they were using the technology were not appropriate because they broke rules on things like data protection and privacy. It set out in case law the guidelines that have to be adhered to for the technology to be lawful—things like public signage, the rate of accuracy and having no racial bias.

Secondly—I do hope I am not taking the Minister’s entire speech—there are guidelines for police. The College of Policing has national authorised professional practice guidelines that the police are supposed to stick to. There is a debate to be had about whether, for the sake of clarity and democratic accountability, we in Parliament should set out something more formal; my right hon. Friend the Member for Maldon made that point. I think there would be some merit in clarifying at a national level where the guidelines sit, but I would not go as far as Europe. If we had done so, those rapists would not have been arrested. I would also be careful to ensure that any legislation is flexible enough to accommodate changing technology. Primary legislation may not be the right vehicle: a regulation-making power might be a more sensible approach, so that things can be kept up to date from time to time.

While we consider that, I strongly urge the Minister not to halt the use of the technology. As we speak, it is arresting criminals in Croydon and elsewhere who would not otherwise be caught. I urge her to continue supporting the police to roll it out. I think some money was allocated in the Budget for the current financial year, to continue developing the technology. I would welcome an update from the Minister on whether that money is still being spent in the current financial year. I do hope it has not somehow been snaffled by the Treasury in a misguided cost-saving effort—

Dame Siobhain McDonagh (in the Chair)

Order. I apologise for interrupting the shadow Secretary of State, but I am looking at the time. I am sure hon. Members would like to hear from the Minister.

Chris Philp

None more so than me. I will conclude by saying that this is an important technology: it takes people off the streets who would otherwise not be caught. The Minister has my support in continuing its roll-out and deployment.