Lord Clement-Jones (Liberal Democrat - Life peer)
My Lords, powerful AI tools are transforming policing and reshaping how forces investigate, patrol and make decisions, often with profound implications. This amendment would make it a legal requirement for forces to disclose any algorithmic tool used in this way that might affect a person’s rights or freedoms.
The Government’s algorithmic transparency recording standard, ATRS, provides a consistent way for public bodies to explain how their algorithmic tools work, what data they use and how human oversight is maintained. Its goal is a public, searchable record of these systems. Use of the ATRS is mandatory for arm’s-length bodies delivering public services, though the previous Government did not extend that to the police, despite calls from the Committee on Standards in Public Life and from the Justice and Home Affairs Committee.
The College of Policing has now integrated the ATRS into its authorised professional practice. Forces are expected to complete an ATRS report for all relevant tools. That is welcome progress. The hope is that forces will increasingly comply to build public trust and meet their equality and data protection duties. However, while compliance is now expected, it is still not a legal requirement. A force could choose not to use the ATRS, citing operational necessity, and it would not be breaking any law.
Transparency is vital across public services but nowhere more so than in policing, where these systems have the power to alter lives and restrict liberty. That is why Justice and civil liberties groups such as the Ada Lovelace and Alan Turing institutes want police use of these tools to be publicly declared, and want that duty placed on a statutory footing. What is ultimately needed is a national register with real legal force—something the NPCC’s own AI lead has called for.
Government work on such a register is under way. I welcome that project but it will take time, while AI capabilities advance very rapidly indeed. The ATRS is the mechanism we have for now. This amendment would immediately strengthen it, requiring every operational AI tool from facial recognition to predictive mapping to be publicly declared.
Why does this matter? Take gait analysis, identifying people by how they move. No UK force has declared that it uses it, but its potential is recognised. Ireland is already legislating for its use in serious crime. Without a legal duty here, a UK force could deploy gait analysis tomorrow, with no public knowledge or oversight, just as facial recognition pilots proceed today with limited transparency.
This year, forces will spend nearly £2 billion on digital technology and analytics. With growing demand and limited resources, it is no surprise at all that forces turn to AI for efficiency. Yet, without total transparency, this technological shift risks further eroding public trust. Recognition of that need is growing. No one wants to return to the Met’s unlawful gangs matrix, quietly risk-scoring individuals on dubious grounds. For that reason, I urge the Government to accept this vital safeguard. It is a foundation for accountability in a field that will only grow in power and in consequence. I beg to move.
My Lords, as my noble friend Lady Doocey explained, Amendment 431 seeks to place a statutory duty on every police force in England and Wales to disclose its use of algorithmic tools where they affect the rights, entitlements or obligations of individuals.
We are witnessing a rapid proliferation of algorithmic decision-making in policing, from predictive mapping to risk assessment tools used in custody suites. Algorithms are increasingly informing how the state interacts with the citizen, yet too often these tools operate in a black box, hidden from public view and democratic scrutiny. As we have discussed in relation to other technologies such as facial recognition, the deployment of advanced technology without a clear framework undermines public trust.
This amendment requires police forces, as my noble friend explained, to complete entries in the algorithmic transparency recording standard. The ATRS is the Government’s own standard for algorithmic transparency, developed to ensure public sector accountability. My Private Member’s Bill on public authority algorithmic and automated decision-making allows for a more advanced form of reporting. In my view, the ATRS is the bare minimum required for accountability for AI use in the public sector.
Lord Katz (Lab)
My Lords, Amendment 431 deals with the use of algorithmic tools in policing. While the Government agree on the importance of transparency in the use of algorithmic tools by police forces, we do not believe that the amendment would be the optimal means of delivering either meaningful improvements in public confidence or operational benefits for policing.
The proposed duty would require police forces to disclose all algorithmic tools through the Algorithmic Transparency Recording Standard—the ATRS. The ATRS was designed for government departments and arm’s-length bodies, not for operationally independent police forces. While it is an effective tool for those organisations, its high level of technical detail and lack of narrative explanation mean that disclosures would not provide the clarity expected by the public and would risk burying key information in jargon. More importantly, mandating disclosure of all tools beyond the exemptions policy of the ATRS could inadvertently compromise operational security and policing tactics.
The Government are, however, keen to encourage transparency in the use of algorithmic tools by police forces in England and Wales to maintain the support of the public for their use and in keeping with the core tradition of policing by consent. In line with this, the Government have commissioned work on transparency measures for police use of AI and are working closely with the National Police Chiefs’ Council’s AI portfolio and the National Policing Chief Scientific Adviser to develop policies encouraging and supporting appropriate levels of transparency while safeguarding operational integrity. This approach will ensure that transparency is meaningful, proportionate and does not undermine the effectiveness of policing.
It is important to recognise that we are listening to the public and addressing the concerns raised by the noble Baroness, Lady Doocey, about policing encroaching on civil liberties. Indeed, the Government commissioned and published research into public attitudes on the police’s use of AI last year. The research demonstrated strong support for AI use by the police, alongside rightful expectations that AI use be underpinned by rigorous oversight, with humans always clearly involved in decision-making, and by transparency. These findings have been supported elsewhere; for example, in recently published research by CENTRIC, which surveyed 10,000 members of the public. That is why we are working closely with the NPCC to build upon and implement the principles of the covenant for the use of AI in policing, to which all forces in England and Wales have signed up.
The noble Baroness, Lady Doocey, referred to the use of gait analysis, and there was a comparison to live facial recognition. It is important that we understand the risks of bias and discriminatory outcomes from using any policing tool.
To be clear, police deployments must comply with the Equality Act 2010 and data protection law. Forces are required to assess potential discrimination risks and should be able to evidence that tools are necessary, proportionate and fair. Humans remain clearly involved in decision-making, and forces are expected to monitor performance against protected characteristics so that any bias is identified and addressed. Where tools cannot meet these standards in practice, they should not be deployed or must be withdrawn pending remediation.
The noble Lord, Lord Clement-Jones, referred to black box systems. To be clear, we are not comfortable with black box systems being used in policing. Policing requires—
I thank the Minister. Much of what he said about developing an alternative to the ATRS has been encouraging, but, obviously, quite a lot will also depend on—and he went on to talk about data protection—whether officers are trained in how Article 22 of the GDPR operates in terms of automated decision-making. What assurance can the Minister give about the level of knowledge and training in that area?
Lord Katz (Lab)
As I said, police deployments must comply with the Equality Act 2010 and data protection law, which, of course, include the latest data protection law under the GDPR. In relation to that specific point on Article 22 of the GDPR, I will have to write to the noble Lord to give him the full details, but, as I say, the general principle of compliance applies.
Just to finish the point I was making in reference to the noble Lord’s point about black box systems, where a system is inherently opaque, forces must have compensating controls such as rigorous testing, performance monitoring and strong human review, or not use that system.
Given these assurances—and I am grateful to the noble Lord for saying that he was encouraged, and we will wait to hear from his colleague as to whether she is encouraged by these responses—I hope the noble Baroness will be content to withdraw her amendment.
My Lords, I want to make a very brief contribution—cheekily, because I have not taken any role in this Bill. My noble friend’s amendment, what she said in support of it and the words of the noble Baroness, Lady Neville-Rolfe, are highly pertinent to the debate on the Government’s proposal to restrict jury trials. On the Tube in, I read an account of the report from the Institute for Government, which has looked at the Government’s proposals and concluded that the time savings from judge-only trials would be marginal at best, amounting to less than 2% of Crown Court time. It suggests, pertinently, that the Government
“should instead focus on how to drive up productivity across the criminal courts, investing in the workforce and technology required for the courts to operate more efficiently”.
As others who know the situation much better than I do have said, it sounds dire. One is used to all these problems of legacy systems—lack of interoperability and so on. I remember all that being debated at EU level. It is difficult and probably capital-intensive work—at least, initially—but instead of promoting these headline-grabbing gestures about abolishing jury trials, the Government need to fix the terrible lack of efficiency in the criminal justice system. I am not sure that the civil justice system is any better. Having, unfortunately, had a modest involvement in a case in the county court, I found that it was impossible to phone any staff. You might be lucky to get a response to an email after a week.
Making the system work efficiently, with all bits interacting with each other, would do a great deal more to increase productivity and save the time of all those people who are running around. One hears accounts from people who work in the criminal courts of reports not being available, files being lost and staff being absent, let alone the decrepit state of court buildings. All this investment needs to go in before the Government resort to gesture politics and things such as abolishing jury trials.
My Lords, Amendment 432 was so well introduced by my noble friend Lady Doocey. She feels very strongly about this lack of appropriate technology and how it is handicapping our police services. I was delighted to hear what the noble Baroness, Lady Neville-Rolfe, and my noble friend Lady Ludford had to say, because this lack of appropriate technology extends beyond the police services into the wider criminal justice system. This proposed new clause would address the desperate state of police data infrastructure by requiring the Secretary of State to publish a national plan to modernise police data and intelligence systems within 12 months.
As mentioned in the explanatory statement, this is not an abstract bureaucratic request. It is a direct response to, among other things, recommendation 7 of the National Audit on Group-based Child Sexual Exploitation and Abuse by the noble Baroness, Lady Casey. The audit painted a damning picture of the current landscape: intelligence systems that do not talk to one another, vital information trapped in silos and officers unable to join the dots to protect vulnerable children. It is unacceptable that, in 2025, we still rely on fragmented, obsolete IT systems to fight sophisticated networked criminality. This amendment seeks to mandate a coherent national strategy to ensure that antiquated police technology is replaced, that intelligence regarding predatory behaviour is shared effectively across police borders in real time and that we finally close the capability gaps that allow perpetrators of group-based child sexual exploitation to slip through the net.
Amendment 432 would ensure that, when the police hold vital intelligence, they have the systems to use it effectively. We cannot claim to be serious about tackling child exploitation if we do not fix the digital infrastructure that underpins our investigations.
My Lords, I am grateful to the noble Baroness for bringing forward this amendment, which seeks to require the Government to publish a national plan to modernise police data and intelligence systems in England and Wales. At its heart, this amendment speaks to a very practical and pressing concern: that our policing infrastructure must stay up to date with modern crime, particularly the most harmful and insidious forms of abuse.
Outdated and fragmented information systems can frustrate effective policing. That point was raised by the noble Baroness, Lady Casey, in the National Audit on Group-based Child Sexual Exploitation and Abuse, which noted that some police forces are still operating antiquated legacy systems that inhibit real-time data sharing and hinder co-ordinated action across forces and with partner agencies.
Group-based child sexual exploitation is a complex crime. Our response must therefore be equally networked and technologically capable. Recommendation 7 from the noble Baroness, Lady Casey, made it clear that improving data systems is essential—I emphasise that word—to ensuring children’s safety and enabling earlier intervention and more efficient information exchange. I look forward to the Minister’s outline of the steps the Government have already taken to address this issue.
This amendment seeks to take that recommendation forward by requiring a national plan with clear steps and milestones to modernise police data and intelligence systems. We strongly support the idea of having clear milestones not just for police forces and agencies but for the public and Parliament. Transparent targets allow for progress to be measured and debated, and provide operational leaders with something concrete and tangible to work towards.
We also welcome the requirement for annual progress reports to be laid before Parliament until the plan’s objectives are achieved. That level of ongoing scrutiny is important if we truly want to drive systemic improvement rather than allowing good intentions to gather dust. I therefore echo the helpful contributions of my noble friend Lady Neville-Rolfe and the noble Baroness, Lady Ludford; we really must do better.
I look forward to the Minister’s response to this amendment. I would be grateful if he would outline how the Government intend to address the problems identified in the national audit and how they will respond to the constructive challenge that this amendment presents.