Lords Chamber
My Lords, powerful AI tools are transforming policing and reshaping how forces investigate, patrol and make decisions, often with profound implications. This amendment would make it a legal requirement for forces to disclose any algorithmic tool used in this way that might affect a person's rights or freedoms.
The Government’s algorithmic transparency recording standard, ATRS, provides a consistent way for public bodies to explain how their algorithmic tools work, what data they use and how human oversight is maintained. Its goal is a public, searchable record of these systems. Use of the ATRS is mandatory for arm’s-length bodies delivering public services, though the previous Government did not extend that to the police, despite calls from the Committee on Standards in Public Life and from the Justice and Home Affairs Committee.
The College of Policing has now integrated the ATRS into its authorised professional practice. Forces are expected to complete an ATRS report for all relevant tools. That is welcome progress. The hope is that forces will increasingly comply to build public trust and meet their equality and data protection duties. However, while compliance is now expected, recording a tool is still not a legal requirement. A force could still choose not to use the ATRS, citing operational necessity, and it would not be breaking any law.
Transparency is vital across public services but nowhere more so than in policing, where these systems have the power to alter lives and restrict liberty. That is why Justice and civil liberties groups, along with the Ada Lovelace and Alan Turing institutes, want police use of these tools to be publicly declared and placed on a statutory footing. What is ultimately needed is a national register with real legal force—something the NPCC's own AI lead has called for.
Government work on such a register is under way. I welcome that project but it will take time, while AI capabilities advance very rapidly indeed. The ATRS is the mechanism we have for now. This amendment would immediately strengthen it, requiring every operational AI tool from facial recognition to predictive mapping to be publicly declared.
Why does this matter? Take gait analysis, identifying people by how they move. No UK force has declared that it uses it, but its potential is recognised. Ireland is already legislating for its use in serious crime. Without a legal duty here, a UK force could deploy gait analysis tomorrow, with no public knowledge or oversight, just as facial recognition pilots proceed today with limited transparency.
This year, forces will spend nearly £2 billion on digital technology and analytics. With growing demand and limited resources, it is no surprise at all that forces turn to AI for efficiency. Yet, without total transparency, this technological shift risks further eroding public trust. Recognition of that need is growing. No one wants to return to the Met’s unlawful gangs matrix, quietly risk-scoring individuals on dubious grounds. For that reason, I urge the Government to accept this vital safeguard. It is a foundation for accountability in a field that will only grow in power and in consequence. I beg to move.
My Lords, as my noble friend Lady Doocey explained, Amendment 431 seeks to place a statutory duty on every police force in England and Wales to disclose its use of algorithmic tools where they affect the rights, entitlements or obligations of individuals.
We are witnessing a rapid proliferation of algorithmic decision-making in policing, from predictive mapping to risk assessment tools used in custody suites. Algorithms are increasingly informing how the state interacts with the citizen, yet too often these tools operate in a black box, hidden from public view and democratic scrutiny. As we have discussed in relation to other technologies such as facial recognition, the deployment of advanced technology without a clear framework undermines public trust.
This amendment requires police forces, as my noble friend explained, to complete entries in the algorithmic transparency recording standard. The ATRS is the Government’s own standard for algorithmic transparency, developed to ensure public sector accountability. My Private Member’s Bill on public authority algorithmic and automated decision-making allows for a more advanced form of reporting. In my view, the ATRS is the bare minimum required for accountability for AI use in the public sector.