Lords Chamber

My Lords, Amendment 374 seeks to place statutory guardrails on the use of live facial recognition, echoing the recent calls from the Equality and Human Rights Commission. We recognise that this technology can assist the police in tackling serious crime, but it must be used responsibly. Its rapid spread into everyday policing before essential safeguards or parliamentary scrutiny are in place raises profound constitutional concerns, particularly in the policing of dissent. Amendment 374 addresses the most contentious use of this technology, at protests and public assemblies. It would prohibit live facial recognition when police impose conditions under the Public Order Act unless and until Parliament had approved a new statutory code of practice. These are moments when people exercise their fundamental rights to free expression and peaceful assembly; rights which depend on participants feeling safe from tracking or retrospective profiling.
This Bill already tightens protest offences and curbs anonymity; layering unregulated facial scanning on top of those restrictions risks further shrinking the space for lawful dissent. Many people will have perfectly legitimate reasons to think twice before attending a demonstration if they know their face may be scanned. Without clarity on how watch-lists used at protests are compiled, people have no way of knowing whether they are being flagged for genuine risk or for the views they hold. At a protest, the chilling effect is not just about being scanned; it is the fear of political profiling. If the Government cannot clearly define who is a legitimate target for facial recognition at a peaceful assembly, then such deployments are, by definition, arbitrary and cannot meet the legal test of necessity and proportionality.
Operationally, the emerging concerns around false positives, and the significantly increased risk of misidentification for people from minority-ethnic backgrounds, pose real difficulties for policing large public gatherings. Deployment without a code of practice is likely to result in dozens of wrongful stops to verify identities, and in confrontations that divert officers from genuine security threats and from de-escalating crowds. We have already seen how damaging these errors can be. Just in the last few weeks, an innocent south Asian man was arrested at his home in Southampton for a burglary 100 miles away in Milton Keynes. He was handcuffed and held for nearly 10 hours because he was wrongly matched to CCTV footage by a Home Office algorithm whose own research shows significantly higher false-positive rates for black and Asian faces. Last month, a man was publicly ejected from his local supermarket after staff misinterpreted a facial recognition alert.
These are not minor glitches to be shrugged off. They are serious violations that erode public trust, particularly in communities already wary of state power. The Government’s consultation is welcome, but it is far too slow for the pace of change we see on our streets. Until Parliament has set clear rules, Amendment 374 is both necessary and proportionate. We must ensure that Parliament, not opaque algorithms, decides the limits of state power. I beg to move.
My Lords, we are talking today about live facial recognition at protests and why the police must not be allowed to use it until Parliament has agreed a clear and democratic code of practice. At its heart, Amendment 374 is about power and trust. Live facial recognition is not just another camera on a street corner; it is a mass surveillance tool that can scan every face in a crowd, compare people in real time against a watch-list and permanently change what it feels like to stand in the public square. Once you normalise all that at protests, you change the character of protest itself.
If people think that simply turning up at a demonstration means that their face can be scanned, logged and potentially mismatched to a suspect list, some will decide that it is safer to stay at home. That is a direct, chilling effect on the right to protest, to assemble and to speak out against, or for, the Government. We should not let that happen by stealth through a patchwork of local decisions and internal guidance that most citizens will never see. That is what is happening at the moment.
The technology itself is far from neutral. We know that facial recognition systems can and do get things wrong. They perform differently across age groups and ethnicities. A false match in the context of a protest is not a minor inconvenience. It can mean being stopped, questioned, detained or stigmatised in front of your friends, your colleagues or your community, not because of something you did but because an algorithm made a guess. Allowing that at political protests without proper rules and oversight is an invitation to injustice.
It is not enough to say, “Trust the police. We have internal policies”. The question here is not whether any particular chief constable is well-intentioned; it is whether the state should be able to scan and track people at political gatherings without Parliament having debated, defined and limited that power. In a democracy, if the Government want tools that can alter the balance of power between citizen and state, they must come to Parliament, set out the case and accept constraints.
That is why a publicly debated statutory code of practice matters. It is where we answer basic questions that are currently left in the grey zone. In what circumstances, if any, is live facial recognition at a protest justified? Who sets the watch-lists and on what criteria? What happens to images of people who are not of interest? Are they actually deleted? If so, how quickly? Who can access them and for what purposes? What independent oversight exists when things go wrong? Until those questions are answered openly, the use of live facial recognition at protests rests on unpublished risk assessments and technical documents that ordinary citizens cannot challenge and that elected representatives cannot easily amend. That is the opposite of how intrusive powers should be operated in a liberal democracy.
We should also be honest about the precedent. If we accept live facial recognition at protests now, without a code, it will be used more often and for more purposes later. Once the infrastructure is there and the practice is normalised, it will be very hard to row back. The time to set limits is before the rollout, not after the abuses. Police should not have, without parliamentary approval, the ability to quietly turn every protest into a data-harvesting exercise, watching not just the few who pose a risk but the many who are simply exercising their rights.
The principle is simple: if live facial recognition is to be used at all in the context of political protest, it must be under a clear and democratically approved code of practice, debated in Parliament, tested against our human rights obligations and subject to real oversight and redress. Until that is in place, the police should not be allowed to deploy this technology at protests.
Lords Chamber

My Lords, Amendment 396 in my name raises fundamental issues about this part of the Bill. My concern is about Clause 138 and its clear potential to enable facial recognition searches of the DVLA’s vast image database. That would be a dramatic change. At present, drivers’ data can be accessed only for road traffic purposes.
Amendment 396 would place a safeguard in the Bill to prevent authorised persons using information obtained under these powers for the purposes of biometric searches using facial recognition technology. It would ensure that the private images of millions of citizens cannot be repurposed to feed live or retrospective facial recognition systems without full parliamentary debate and explicit consent. Around 55 million facial images are held by the DVLA; they are collected in good faith and with a clear expectation of privacy, alongside names, addresses and medical records, for the routine purposes of getting a driving licence. Turning that repository into a police biometric pool would mark a profound shift in the relationship between the state and the citizen. Combined with live facial recognition on our streets, it would create the infrastructure for real-time, population-scale surveillance, scanning the faces of tens of millions of law-abiding people as they go about their daily lives.
In effect, most of us would find ourselves on a perpetual digital watch-list, our faces repeatedly checked for potential wrongdoing. That is troubling not only because of the bias and misidentification in these systems but because it is simply not proportionate policing. The public broadly support the use of technology to catch criminals, but they also want limits and safeguards. A 2024 survey by the Centre for Emerging Technology and Security and the Alan Turing Institute found that only one in five people—just 19%—trusted police forces to use biometric tools responsibly.
That anxiety is particularly strong among women. Barely three years ago, the Casey review exposed appalling misogyny and a serious abuse of data access within policing. Against that backdrop, granting digital access to millions of female drivers’ personal details and photographs is hardly reassuring, especially when previous safeguards have failed so spectacularly. Last year alone, 229 serving police officers and staff were arrested for domestic abuse-related offences, and a further 1,200 were on restricted duties linked to such allegations. The fear is real that combining facial recognition with DVLA access could allow abusers within policing to misuse these powers to trace survivors, to remove their freedom to hide and to undermine public trust still further. We also know that this technology misidentifies members of ethnic-minority communities far more frequently, compounding injustice and eroding confidence in policing by consent.
I share the ambition for policing to use data more intelligently. Forces need joined-up intelligence systems across the entire criminal justice network, but there is a world of difference between targeted access to high-risk offender data and a blank cheque to harvest the personal information of millions of people.
Clause 138 is far too wide. It allows the Secretary of State to authorise digital access for policing or law enforcement purposes, which frankly could mean anything. What information may be accessed, and for what purpose, would later be set by regulations made under the negative procedure, giving Parliament only the most cursory scrutiny of measures with huge implications for privacy and liberty. Such sweeping powers should not be slipped through in secondary legislation. The public did not give their driving licence photographs to become part of a national face search system. There has been no debate, no consent and no assessment of the risk to those who have good reason to remain hidden. Once civic freedoms are eroded, they are very rarely rebuilt.
When the Minister replies, I hope we will hear what the Government’s policy intention is. If their intention is to keep open the possibility of using DVLA data for surveillance, they should say so and try to justify it. We know that the police have specifically asked for this. It is not good enough to say, “This is our intention”; my amendment would ensure it cannot happen. That is the safeguard the public expect and the least this Committee should demand.
My Lords, I rise to speak in favour of Amendment 396, to which I have added my name—my notes are only two pages long. It would ensure that the DVLA drivers database was not used for a purpose for which it was never intended; namely, to search drivers’ photos for a match with images collected by live facial recognition.
Facial recognition technology could be a useful tool in fighting serious crime if it were properly regulated and supervised, as is the case with other biometric technologies such as fingerprints and DNA. Currently, however, it is open season on facial recognition, with no statutory constraints on its use or misuse. That means that this deeply invasive, mass surveillance tool poses a serious threat to the civil liberties and human rights of UK citizens. If used in combination with the DVLA drivers database, it would be a disproportionate expansion of police powers to identify and track innocent citizens across time and locations for low-level policing needs. It would give the authorities access to the biometric data of tens of millions of our fellow citizens. It is vital that safeguards are introduced in law to prevent this happening. This is precisely what Amendment 396 would do.
In Committee in the other place, the Policing Minister said that
“police forces do not conduct facial matching against images contained on the DVLA database, and the clause will not change that”.—[Official Report, Commons, Crime and Policing Bill Committee, 29/4/25; col. 442.]
But Clause 138 allows regulations to be made at a later date setting out how driver licensing information will be made accessible to law enforcement. All that Amendment 396 does is create safeguards to ensure that the regulations made under Clause 138 cannot provide for facial recognition searches of the DVLA database. I commend it to the Committee.