Crime and Policing Bill Debate

Department: Home Office
Moved by
379: Clause 125, page 152, line 37, at end insert—
“(4) After section 14, insert—
“14A Imposition of conditions: live facial recognition
Prior to imposing conditions under either section 12 or 14, the senior officer of the Police Force in question must confirm that live facial recognition will not be in use, unless a new statutory code of practice for the use of live facial recognition surveillance in public spaces in England and Wales had previously been presented to, and approved by, both Houses of Parliament.”.”
Member’s explanatory statement
This amendment ensures that police cannot use live facial recognition technology when imposing conditions on public assemblies or processions under Sections 12 or 14, unless a new, specific code of practice governing its use in public spaces has first been formally approved by both Houses of Parliament. It is intended to safeguard public privacy and civil liberties by requiring democratic oversight before this surveillance technology is deployed in such contexts.
Baroness Doocey (LD)

My Lords, in moving Amendment 379, I will speak also to Amendment 471. When used responsibly, live facial recognition can help to protect the public. The real question before us is not whether it is used but how: under what safeguards, with what scrutiny and by what authority from Parliament. At present, the answer is deeply unsatisfactory.

Police forces are rolling out live facial recognition at speed, without a clear legal framework, consistent oversight or meaningful public consultation. Its operational use has more than doubled in a year. Millions of pounds are being spent on new systems and mobile vans, yet there is still no reference to facial recognition in any Act of Parliament. Instead, the police rely on a patchwork of data protection law, the Human Rights Act and non-binding guidance. Parliament must now act urgently to put its use on a clear statutory footing. The police themselves say that this is vital to maintain public trust.

Recent Home Office testing of the police national database’s retrospective facial recognition tool found significantly higher error rates for black and Asian people than for white people. For black women, the false positive rate was almost one in 10 when the system was run on lower settings. It also performs less reliably with children and young people. The human consequences are already here: schoolchildren in uniform wrongly flagged and told to prove their identity, and a black anti-knife campaigner stopped on his way home from volunteering and asked for his fingerprints because the system got it wrong. These are not theoretical risks; they are happening now.

When this became public, Ministers ordered a review and testing of a new algorithm, which is welcome. But questions remain. Why was the bias not disclosed earlier? Why on earth was the regulator not informed? Why are biased algorithms still in use today? A false match rate of nearly one in 10 for black women is not a technical glitch; it is a civil rights issue. Running thousands of searches every month before strengthening statutory oversight only deepens public mistrust. That is why the measures in Amendment 471 deserve very serious consideration.

Amendment 379 is modest and practical. It focuses on one of the most sensitive uses of live facial recognition: protests and public assemblies. It would require the police to pause its use at such events until a new statutory code of practice, approved by Parliament, is in place. That code would set out clearly when surveillance is justified, how watch lists are compiled, what safeguards apply and, crucially, what redress is available when things go wrong.

This Committee has already heard concerns about the gradual narrowing of protest rights. Each new restriction may seem small in isolation, but together they add up. Elsewhere in the Bill, as we heard on Monday, the Government seek to criminalise those who wish to remain anonymous at protests. Combined with expanding facial recognition, that places even greater pressure on protest rights. Taken together, these measures risk discouraging peaceful dissent and undermining freedom of expression.

--- Later in debate ---
Baroness Doocey (LD)

I do not normally disagree with the Minister, although we might be on different sides of an argument, but I found that last comment very bad. We are all on the same side—we all want to catch criminals and prevent crime. That needs to go on the record. From what he just said, it was almost as though he was suggesting that he is on the side of that but we are not. To make it clear, we are not sitting here for the sake of it; we are here because we genuinely believe in this and we want to catch criminals and prevent crime.

Lord Hanson of Flint (Lab)

Let us put out the hand of friendship and make common cause on those issues.

To respond to the noble Baroness’s amendment, I simply say that the consultation is there. Amendment 471 would go quite a long way beyond even that which the noble Baroness, Lady Doocey, brought forward. I believe this to be a potential future crime-fighting tool. It needs regulation around it and that is what the Government are intending to do. We are very clear about that on page 5 of the consultation. How it is regulated and what is regulated, and how this is approached, is what the consultation is about, but I agree with the basic principle of the noble Baroness’s amendment. Therefore, I ask her to withdraw it.

Baroness Doocey (LD)

I would like that in writing.

I thank the Minister for his response and thank all noble Lords who have taken part in this debate. The Minister mentioned the consultation, and I am pleased that the Government will legislate, but I hope Parliament will be very much involved, because, like anything, the devil will be in the detail. Whatever comes out of that will be very important.

Can the Minister tell me what happens if, in response to the consultation, the public say that they do not want the police to access particular databases? Will the Government then take those clauses out of the Bill? Perhaps he could just clarify that.

I have a concern that, even before the consultation began, the Home Office was saying that it hoped the process would pave the way for wider rollout. That does not really inspire confidence that Ministers are keeping an open mind. A consultation should not be used as a rubber stamp; it should be the start of a genuine national conversation about the limits that a free society wants to place on mass biometric data surveillance. For that conversation to mean anything, the public need to know the full picture, how accurate the systems are, and where and when they are being used. Right now, that transparency is not there.

We have heard that the Home Office thinks that:

“Any new laws informed by the consultation would take about two years to be passed by Parliament”.

That is far too slow, given the pace of technological change, and that comment was made in December 2025. All we are asking is that Parliament sets the rules before the technology sets them for us. I hope Parliament will be involved in setting those rules. For now, I beg leave to withdraw the amendment.

Amendment 379 withdrawn.
--- Later in debate ---
Baroness Doocey (LD)

My Lords, Amendment 396 in my name raises fundamental issues about this part of the Bill. My concern is about Clause 138 and its clear potential to enable facial recognition searches of the DVLA’s vast image database. That would be a dramatic change. At present, drivers’ data can be accessed only for road traffic purposes.

Amendment 396 would place a safeguard in the Bill to prevent authorised persons using information obtained under these powers for the purposes of biometric searches using facial recognition technology. It would ensure that the private images of millions of citizens cannot be repurposed to feed live or retrospective facial recognition systems without full parliamentary debate and explicit consent. Around 55 million facial images are held by the DVLA; they are collected in good faith and with a clear expectation of privacy, alongside names, addresses and medical records, for the routine purposes of getting a driving licence. Turning that repository into a police biometric pool would mark a profound shift in the relationship between the state and the citizen. Combined with live facial recognition on our streets, it would create the infrastructure for real-time, population-scale surveillance, scanning the faces of tens of millions of law-abiding people as they go about their daily lives.

In effect, most of us would find ourselves on a perpetual digital watch list, our faces repeatedly checked for potential wrongdoing. That is troubling not only because of the bias and misidentification in these systems but because it is simply not proportionate policing. The public broadly support the use of technology to catch criminals, but they also want limits and safeguards. A 2024 survey by the Centre for Emerging Technology and Security and the Alan Turing Institute found that only one in five people—just 19%—trusted police forces to use biometric tools responsibly.

That anxiety is particularly strong among women. Barely three years ago, the Casey review exposed appalling misogyny and a serious abuse of data access within policing. Against that backdrop, granting digital access to millions of female drivers’ personal details and photographs is hardly reassuring, especially when previous safeguards have failed so spectacularly. Last year alone, 229 serving police officers and staff were arrested for domestic abuse-related offences, and a further 1,200 were on restricted duties linked to such allegations. The fear is real that combining facial recognition with DVLA access could allow abusers within policing to misuse these powers to trace survivors, to remove their freedom to hide and to undermine public trust still further. We also know that this technology misidentifies members of ethnic-minority communities far more frequently, compounding injustice and eroding confidence in policing by consent.

I share the ambition for policing to use data more intelligently. Forces need joined-up intelligence systems across the entire criminal justice network, but there is a world of difference between targeted access to high-risk offender data and a blank cheque to harvest the personal information of millions of people.

Clause 138 is far too wide. It allows the Secretary of State to authorise digital access for policing or law enforcement purposes, which frankly could mean anything. What information may be accessed, and for what purpose, would later be set by regulations made under the negative procedure, giving Parliament only the most cursory scrutiny of measures with huge implications for privacy and liberty. Such sweeping powers should not be slipped through in secondary legislation. The public did not give their driving licence photographs to become part of a national face search system. There has been no debate, no consent and no assessment of the risk to those who have good reason to remain hidden. Once civic freedoms are eroded, they are very rarely rebuilt.

When the Minister replies, I hope we will hear what the Government’s policy intention is. If their intention is to keep open the possibility of using DVLA data for surveillance, they should say so and try to justify it. We know that the police have specifically asked for this. It is not good enough to say, “This is our intention”; my amendment would ensure it cannot happen. That is the safeguard the public expect and the least this Committee should demand.

Lord Strasburger (LD)

My Lords, I rise to speak in favour of Amendment 396, to which I have added my name—my notes are only two pages long. It would ensure that the DVLA drivers database was not used for a purpose for which it was never intended; namely, to search drivers’ photos for a match with images collected by live facial recognition.

Facial recognition technology could be a useful tool in fighting serious crime if it were properly regulated and supervised, as is the case with other biometric technologies such as fingerprints and DNA, but currently it is open season on facial recognition, with no statutory constraints on its use or misuse. That means that this deeply invasive, mass surveillance tool poses a serious threat to the civil liberties and human rights of UK citizens. If used in combination with the DVLA drivers database, it would be a disproportionate expansion of police powers to identify and track innocent citizens across time and locations for low-level policing needs. It would give the authorities access to the biometric data of tens of millions of our fellow citizens. It is vital that safeguards are introduced in law to prevent this happening. This is precisely what Amendment 396 would do.

In Committee in the other place, the Policing Minister said that

“police forces do not conduct facial matching against images contained on the DVLA database, and the clause will not change that”.—[Official Report, Commons, Crime and Policing Bill Committee, 29/4/25; col. 442.]

But Clause 138 allows regulations to be made at a later date setting out how driver licensing information will be made accessible to law enforcement. All that Amendment 396 does is create safeguards to ensure that the regulations made under Clause 138 cannot provide for facial recognition searches of the DVLA database. I commend it to the Committee.