Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure, as always, to serve under your chairmanship, Dame Siobhain. I congratulate my right hon. Friend the Member for Maldon (Sir John Whittingdale) on securing the debate and on the characteristically thoughtful manner in which he approached his speech.
I think this is the first time that I have appeared opposite the new Minister for Policing, Fire and Crime Prevention—the job that I was doing until a few months ago—so let me congratulate her on her appointment. Although I will of course hold the Government to account, I will do everything I can to constructively support her in making a great success of the job, and I really do wish her well in the role.
I want to start by reminding colleagues of the way that live facial recognition works. It is different from retrospective facial recognition, which we have not debated today and, in the interests of time, I do not propose to go into. As some Members have already said, live facial recognition starts with a watchlist of people who are wanted by the police. It is not the case that anyone can get on that watchlist, which generally comprises people who are wanted for criminal offences—often very serious offences—people who have failed to attend court, and people who are registered sex offenders, where the police want to check that they are complying with their conditions. As people walk down a high street, they are scanned, typically by a CCTV camera on a mobile van, and then compared to the watchlist. The vast majority of people are not on the watchlist, as we would expect, and their image is immediately and automatically deleted. Where a person is on the watchlist, the police will stop them and ask if they have any form of identification.
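The screening process described above can be sketched in code, purely for illustration. This is not the Met's actual system: the `screen_passerby` function, the toy two-number "embeddings", and the cosine-similarity comparison are all assumptions standing in for a real face-recognition model, with the 0.6 threshold taken from the setting mentioned later in the debate.

```python
import math

MATCH_THRESHOLD = 0.6  # the similarity setting referred to in the debate


def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def screen_passerby(face_embedding, watchlist):
    """Compare one scanned face against the watchlist.

    Returns the matched watchlist entry ID, or None. On no match the
    scanned embedding is discarded immediately: nothing is retained
    about people who are not on the watchlist.
    """
    for entry_id, entry_embedding in watchlist.items():
        if cosine_similarity(face_embedding, entry_embedding) >= MATCH_THRESHOLD:
            return entry_id  # flag for an officer to make a physical stop
    # No match: delete the image/embedding straight away.
    del face_embedding
    return None
```

For example, with a one-entry watchlist `{"wanted_person": [1.0, 0.0]}`, a scan of `[0.9, 0.1]` exceeds the threshold and is flagged, while `[0.0, 1.0]` scores zero similarity and returns `None`, modelling the automatic deletion of non-matches.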
To be very clear, no one gets convicted on the basis of that facial recognition match, so it is not overturning the presumption of innocence, and if it turns out that the person stopped is not the person on the watchlist, obviously they can continue on their way. However, if they are the person on the watchlist, a normal criminal investigation will follow, with the normal standards of evidence.
On the point about the automatic deletion of data, there are many examples, but the one I can remember is Google's incognito browsing mode. That was meant to be very private, with only the user able to see where they went, but Google was found to be storing that data, and it has faced legal challenge for breaching the GDPR and other privacy laws. Companies may say that things are immediately deleted, but that is not always true.
That is a good point; we must ensure that the operating procedures are adhered to, and I will come on to that a little later. However, to be absolutely clear, if someone is identified as a match, a normal criminal investigation is conducted to normal criminal standards. Nobody is convicted on the basis of this evidence alone—or, indeed, on the basis of this evidence at all.
Let me come to the question about racial disparity. When this technology was first introduced, about seven years ago, there were reports—accurate reports—that there was racial bias in the way that the algorithm operated. The algorithm has been developed a great deal since those days, and it has been tested definitively by the National Physical Laboratory, the nation's premier testing laboratory. NPL testing is the gold standard, and this technology has been tested relatively recently. For the benefit of Members, I will read out the results of that testing:
“The NPL study found that, when used at the settings maintained by the Met”—
that is the 0.6 setting that the hon. Member for Brent East (Dawn Butler) referred to earlier—
“there was no statistically significant difference in the facial recognition technology’s accuracy across”
different demographic groups. In other words, the technology as it is being used today, not as it was five years ago when there were issues, has been tested by the NPL, and no racial bias was found at the settings used.