Border Security, Asylum and Immigration Bill

Debate between Chris Philp and Iqbal Mohamed
Chris Philp

I agree with both my colleagues, and that is why we have tabled amendments and new clauses to address this issue. I will come on to those in a moment.

It was a Labour Government that chose to cancel the removals deterrent before it started, and that is why the numbers are higher than they have ever been in history. It is a result of their choices.

Iqbal Mohamed (Dewsbury and Batley) (Ind)

Talking of the Rwanda scheme, the previous Tory Government spent £700 million on a scheme that saw four volunteers removed. That figure included £290 million given to Rwanda for nothing in return and £134 million on IT systems that were never used. Can we get a refund?

Chris Philp

As I said already, the plan was never started. The first plane was due to take off on 24 July, but the Labour Government cancelled it within days of coming to office. The money would have been extremely well spent had the scheme started, because the deterrent effect would have stopped the boats, meaning that we would not have tens of thousands of people in hotels costing billions and billions.

While we are on the topic of hotels, let us look at how the Labour Government’s pledge during the election to end the use of asylum hotels is going. The numbers in asylum hotels have gone up by 8,000 so far under this Labour Government. Speaking of removals deterrents, I was in Berlin four or five weeks ago talking to members of the CDU party, which is now in Government. The incoming German Government intend to implement a removals deterrent very similar in concept to the Rwanda scheme. So other Governments around the world have realised that they have to do this; it worked in Australia, and the new German Government will be doing something very similar. It is just our Government who are going headlong in the opposite direction.

Facial Recognition: Police Use

Debate between Chris Philp and Iqbal Mohamed
Wednesday 13th November 2024


Westminster Hall

Chris Philp (Croydon South) (Con)

It is a pleasure, as always, to serve under your chairmanship, Dame Siobhain. I congratulate my right hon. Friend the Member for Maldon (Sir John Whittingdale) on securing the debate and on the characteristically thoughtful manner in which he approached his speech.

I think this is the first time that I have appeared opposite the new Minister for Policing, Fire and Crime Prevention—the job that I was doing until a few months ago—so let me congratulate her on her appointment. Although I will of course hold the Government to account, I will do everything I can to constructively support her in making a great success of the job, and I really do wish her well in the role.

I want to start by reminding colleagues of the way that live facial recognition works. It is different from retrospective facial recognition, which we have not debated today and, in the interests of time, I do not propose to go into. As some Members have already said, live facial recognition starts with a watchlist of people who are wanted by the police. It is not the case that anyone can get on that watchlist, which generally comprises people who are wanted for criminal offences—often very serious offences—people who have failed to attend court, and people who are registered sex offenders, where the police want to check that they are complying with their conditions. As people walk down a high street, they are scanned, typically by a CCTV camera on a mobile van, and then compared to the watchlist. The vast majority of people are not on the watchlist, as we would expect, and their image is immediately and automatically deleted. Where a person is on the watchlist, the police will stop them and ask if they have any form of identification.

To be very clear, no one gets convicted on the basis of that facial recognition match, so it is not overturning the presumption of innocence, and if it turns out that the person stopped is not the person on the watchlist, obviously they can continue on their way. However, if they are the person on the watchlist, a normal criminal investigation will follow, with the normal standards of evidence.
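The watchlist flow described above can be sketched roughly in code. This is purely an illustrative sketch: the class names, the toy cosine-similarity function and the use of 0.6 as the match threshold mirror the description given in the debate, but none of it represents the Met's actual system.

```python
# Illustrative sketch of the live facial recognition flow described above.
# All names are hypothetical; real deployments use trained face-embedding
# models and operational thresholds set by each police force.

from dataclasses import dataclass


@dataclass
class Face:
    embedding: tuple  # a numeric vector produced by a face-recognition model


def similarity(a: Face, b: Face) -> float:
    # Placeholder: real systems compare learned embeddings, commonly with
    # cosine similarity, which is what this toy version computes.
    dot = sum(x * y for x, y in zip(a.embedding, b.embedding))
    norm_a = sum(x * x for x in a.embedding) ** 0.5
    norm_b = sum(x * x for x in b.embedding) ** 0.5
    return dot / (norm_a * norm_b)


def process_frame(faces_in_frame, watchlist, threshold=0.6):
    """Compare each scanned face against the watchlist. Non-matches are
    discarded immediately, mirroring the automatic deletion described in
    the debate; matches are flagged so an officer can stop the person."""
    alerts = []
    for face in faces_in_frame:
        best = max((similarity(face, w) for w in watchlist), default=0.0)
        if best >= threshold:
            alerts.append(face)  # officer stops the person and asks for ID
        # otherwise nothing is retained: the image is deleted straight away
    return alerts
```

The key property the sketch illustrates is that a match only triggers a stop, not a conviction; everything after the alert is an ordinary investigation.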

Iqbal Mohamed

On the point about the automatic deletion of data, there are many examples, but the one I remember is Google's Incognito browsing mode. It was meant to be private, with only the user seeing where they had been, but Google was found to be storing that data, and it has been legally challenged and prosecuted for breaching the GDPR and other privacy laws. Companies may say that data is immediately deleted, but that is not always true.

Chris Philp

That is a good point; we must ensure that the operating procedures are adhered to, and I will come on to that a little later. However, to be absolutely clear, if someone is identified as a match, a normal criminal investigation is conducted to normal criminal standards. Nobody is convicted on the basis of this evidence alone—or, indeed, on the basis of this evidence at all.

Let me come to the question about racial disparity. When this technology was first introduced, about seven years ago, there were reports—accurate reports—that there was racial bias in the way that the algorithm operated. The algorithm has been developed a great deal since those days, and it has been tested definitively by the National Physical Laboratory (NPL), the nation’s premier testing laboratory. NPL testing is the gold standard of testing, and this technology has been tested relatively recently. For the benefit of Members, I will read out the results of that testing:

“The NPL study found that, when used at the settings maintained by the Met”—

that is the 0.6 setting that the hon. Member for Brent East (Dawn Butler) referred to earlier—

“there was no statistically significant difference in the facial recognition technology’s accuracy across”

different demographic groups. In other words, the technology as it is used today—not as it was five years ago, when there were issues—has been certified by the NPL and found to show no racial bias at the settings used.