Iqbal Mohamed debates involving the Home Office

Facial Recognition: Police Use

Iqbal Mohamed Excerpts
Wednesday 13th November 2024

Westminster Hall

Iqbal Mohamed (Dewsbury and Batley) (Ind)

It is a pleasure to serve under your chairship, Dame Siobhain. I thank the right hon. Member for Maldon (Sir John Whittingdale) for securing this important debate.

I have researched this subject and listened to hon. Members’ contributions, and it has been frankly shocking to learn that live facial recognition (LFR) has been in use since 2017 without any specific legislation in place to control its use and protect our civil liberties. That is seven years too many without legislation. Although I agree that the use of real-time facial recognition in the United Kingdom promises enhanced security and efficiency, it also raises significant legal and moral concerns, and it carries severe adverse consequences for our society.

As a former software test manager, I am extremely concerned that private companies which profit from their technology are allowed to self-regulate and to confirm the efficacy of the products they sell; that the police are guided by those companies in how to use the tools; and that the police rely on the companies’ own reports of their efficacy when taking legal action against civilians who may be innocent.

The technology operates by capturing and analysing highly sensitive and personal biometric data. As has been mentioned, the legal framework for its use is complex and at times insufficient. The Data Protection Act 2018 and the General Data Protection Regulation provide some safeguards, requiring data processing to be fair, necessary and proportionate. However, the lack of specific legislation for facial recognition technology leaves huge room for misuse and overreach.

The deployment of this technology without explicit consent undermines several of our fundamental rights, some of which have been mentioned. The first is the right to privacy: constant surveillance and the collection of biometric data without explicit consent infringe an individual’s privacy rights. This is particularly concerning when the technology is used in public spaces without people’s knowledge. The second right is the right to freedom of peaceful assembly and expression. The use of facial recognition can deter individuals from participating in protests or public gatherings due to the fear of being monitored or identified. This undermines the fundamental right to assemble and express opinions freely.

The third right is the right to non-discrimination. As has been mentioned, facial recognition systems have been shown to have higher error rates for people of colour, women and younger individuals. This bias can lead to disproportionate targeting and wrongful arrests, exacerbating existing inequalities and discrimination. The final right is the right to data protection. The collection, storage and processing of biometric data must comply with data protection laws. Inadequate safeguards can lead to unauthorised access and misuse of personal data.

My hon. Friend the Member for Leicester South (Shockat Adam) cited examples of how this technology is used in Russia and China, and we know that it is used extensively in Israel as part of its apartheid regime and occupation of the Palestinian people. Such violations highlight the need for strict regulation and oversight to ensure that the deployment of facial recognition technology does not infringe fundamental human rights. The technology subjects individuals to constant surveillance, often without their knowledge, eroding trust in public institutions. The ethical principle of autonomy is compromised when people are unaware that their biometric data is being collected and analysed.

Let me cite some examples of the technology’s inefficacy and unreliability. In 2020, the Court of Appeal found that South Wales Police’s use of facial recognition technology was unlawful, and that the force had breached privacy rights and failed to adequately assess the risks to individual freedoms. The technology is not infallible: misidentifications can lead to miscarriages of justice, where innocent individuals are wrongly accused or detained.

The disproportionate impact of facial recognition (FR) technology on black people and people of colour is particularly concerning. Research has consistently shown that these systems are more likely to misidentify individuals from those groups. For example, a National Institute of Standards and Technology study—I do not know how old it is—found that FR algorithms were up to 100 times more likely to misidentify black and Asian faces than white faces. This disparity not only undermines the technology’s reliability, but perpetuates systemic racism. In practice, it means that black people and people of colour are more likely to be subjected to unwanted surveillance and scrutiny, which can lead to a range of negative outcomes.
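
For readers unfamiliar with how such a figure is derived, the headline disparity is simply a ratio of per-group false-match rates. The sketch below uses hypothetical counts chosen to produce a 100x disparity; all names and numbers are illustrative only, not NIST data.

```python
# Sketch of how a false-match disparity ratio is derived from per-group
# test results. All counts here are hypothetical and illustrative only.
false_matches = {"group_a": 200, "group_b": 2}        # false matches observed
comparisons = {"group_a": 10_000, "group_b": 10_000}  # trials per group

rates = {g: false_matches[g] / comparisons[g] for g in false_matches}
for group, rate in rates.items():
    print(f"{group}: false-match rate {rate:.2%}")

disparity = max(rates.values()) / min(rates.values())
print(f"disparity ratio: {disparity:.0f}x")  # 100x with these invented counts
```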

There are other examples of miscarriages of justice and misuse. In one instance, the Metropolitan police used FR technology at the Notting Hill carnival, leading to the wrongful identification and harassment of innocent individuals. These and the other examples cited by hon. Members underscore the potential for significant harm when this technology is deployed without adequate safeguards.

In conclusion, although facial recognition technology offers potential benefits, its deployment must be carefully regulated to prevent misuse and protect individual rights. The legal framework needs to be strengthened to ensure that the use of the technology is transparent, accountable and subject to rigorous oversight. We must also address the inherent bias in these systems to prevent the further entrenchment of racial inequalities. As we navigate the complexities of integrating new technologies into our society, let us prioritise the protection of our fundamental rights and ensure that advancements serve to enhance rather than undermine our collective wellbeing.

--- Later in debate ---
Chris Philp (Croydon South) (Con)

It is a pleasure, as always, to serve under your chairmanship, Dame Siobhain. I congratulate my right hon. Friend the Member for Maldon (Sir John Whittingdale) on securing the debate and on the characteristically thoughtful manner in which he approached his speech.

I think this is the first time that I have appeared opposite the new Minister for Policing, Fire and Crime Prevention—the job that I was doing until a few months ago—so let me congratulate her on her appointment. Although I will of course hold the Government to account, I will do everything I can to constructively support her in making a great success of the job, and I really do wish her well in the role.

I want to start by reminding colleagues of the way that live facial recognition works. It is different from retrospective facial recognition, which we have not debated today and, in the interests of time, I do not propose to go into. As some Members have already said, live facial recognition starts with a watchlist of people who are wanted by the police. It is not the case that anyone can get on that watchlist, which generally comprises people who are wanted for criminal offences—often very serious offences—people who have failed to attend court, and people who are registered sex offenders, where the police want to check that they are complying with their conditions. As people walk down a high street, they are scanned, typically by a CCTV camera on a mobile van, and then compared to the watchlist. The vast majority of people are not on the watchlist, as we would expect, and their image is immediately and automatically deleted. Where a person is on the watchlist, the police will stop them and ask if they have any form of identification.
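
A concrete, minimal sketch may help readers follow the matching step just described. Everything in the code below is an illustrative assumption: the cosine-similarity embedding model, the names and the dictionary watchlist, with the threshold borrowed from the "0.6 setting" quoted later in the debate. It is not any police force's actual system.

```python
# Minimal sketch of live facial recognition matching, assuming a
# cosine-similarity face embedding model. Illustrative only.
import numpy as np

MATCH_THRESHOLD = 0.6  # the "0.6 setting" quoted later in the debate

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def process_frame(face_embedding: np.ndarray,
                  watchlist: dict[str, np.ndarray]) -> str | None:
    """Compare one passer-by's embedding against the watchlist.

    Returns a watchlist identifier on a match, for an officer to verify;
    otherwise returns None and the caller is expected to discard the
    embedding immediately.
    """
    for person_id, reference in watchlist.items():
        if cosine_similarity(face_embedding, reference) >= MATCH_THRESHOLD:
            return person_id
    return None  # no match: image deleted, person walks on unaware

# Usage with random stand-in embeddings:
rng = np.random.default_rng(0)
watchlist = {"wanted-001": rng.normal(size=128)}
print(process_frame(rng.normal(size=128), watchlist))  # almost certainly None
```

Raising the threshold trades fewer false matches for more missed matches, which is why the specific operating setting matters so much in the NPL results quoted later.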

To be very clear, no one gets convicted on the basis of that facial recognition match, so it is not overturning the presumption of innocence, and if it turns out that the person stopped is not the person on the watchlist, obviously they can continue on their way. However, if they are the person on the watchlist, a normal criminal investigation will follow, with the normal standards of evidence.

Iqbal Mohamed

On the point about the automatic deletion of data, there are many examples, but the one I can remember is Google Chrome’s Incognito browsing mode. That mode was meant to be private—only you saw where you went—but Google was found to be storing that data, and it has faced legal challenges for breaching the GDPR and other privacy laws. Companies may say that data is immediately deleted, but that is not always true.
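
The intervention's underlying point, that deletion claims should be verified rather than taken on trust, is the kind of check a software tester would automate. The sketch below uses a hypothetical FrameStore class standing in for a vendor pipeline; it is not any real system's API.

```python
# Sketch of an audit test for a "non-matches are deleted immediately"
# claim. FrameStore is hypothetical, standing in for a vendor pipeline.
class FrameStore:
    def __init__(self) -> None:
        self._frames: dict[str, bytes] = {}

    def ingest(self, frame_id: str, data: bytes) -> None:
        self._frames[frame_id] = data

    def process_non_match(self, frame_id: str) -> None:
        self._frames.pop(frame_id, None)  # the claimed "immediate deletion"

    def retained_ids(self) -> set[str]:
        return set(self._frames)

def test_non_match_is_actually_deleted() -> None:
    store = FrameStore()
    store.ingest("frame-42", b"biometric-bytes")
    store.process_non_match("frame-42")
    assert "frame-42" not in store.retained_ids(), "deletion claim not honoured"

test_non_match_is_actually_deleted()
print("deletion verified for this code path")
```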

Chris Philp

That is a good point; we must ensure that the operating procedures are adhered to, and I will come on to that a little later. However, to be absolutely clear, if someone is identified as a match, a normal criminal investigation is conducted to normal criminal standards. Nobody is convicted on the basis of this evidence alone—or, indeed, on the basis of this evidence at all.

Let me come to the question about racial disparity. When this technology was first introduced, about seven years ago, there were reports—accurate reports—of racial bias in the way that the algorithm operated. The algorithm has been developed a great deal since those days, and it has been tested definitively by the National Physical Laboratory (NPL), the nation’s premier testing laboratory. NPL testing is the gold standard of testing, and this technology has been tested relatively recently. For the benefit of Members, I will read out the results of that testing:

“The NPL study found that, when used at the settings maintained by the Met”—

that is the 0.6 setting that the hon. Member for Brent East (Dawn Butler) referred to earlier—

“there was no statistically significant difference in the facial recognition technology’s accuracy across”

different demographic groups. In other words, the technology as it is being used today—not as it was five years ago, when there were issues—has been certified by the NPL, and it has been found that there is no racial bias at the settings used.
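
The kind of result being quoted (no statistically significant difference in false-match rates across demographic groups at a fixed threshold) can be illustrated with a standard two-proportion z-test. The counts below are invented purely to show the calculation; they are not NPL data.

```python
# Sketch of a significance check on false-match rates for two groups,
# using a two-proportion z-test. Counts are hypothetical, not NPL data.
from statistics import NormalDist

def two_proportion_z_test(fm_a: int, n_a: int, fm_b: int, n_b: int) -> float:
    """Two-sided p-value for H0: both groups share one false-match rate."""
    p_a, p_b = fm_a / n_a, fm_b / n_b
    pooled = (fm_a + fm_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: 3 false matches in 10,000 comparisons vs 5 in 10,000.
p_value = two_proportion_z_test(3, 10_000, 5, 10_000)
print(f"p = {p_value:.2f}")  # about 0.48 here: no significant difference
```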