Question to the Home Office:
To ask the Secretary of State for the Home Department, pursuant to the Answer of 15 December 2025 to Question UIN 97805, whether any estimate has been made of the number of potential misidentifications by police resulting from potential bias in the PND facial search algorithm.
Answer:
The Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Initial findings from independent testing carried out by the National Physical Laboratory were shared with the Home Office in March 2024. The draft findings showed a potential bias in the algorithm used by specially trained operators in police forces to search the Police National Database (PND). The findings were explored with the National Physical Laboratory, and risks and mitigations were discussed with policing experts. Home Office Ministers were first made aware of the bias in October 2024. The final report was provided in April 2025 and updated for publication in October 2025.
The Government has tasked His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), with support from the Forensic Science Regulator, to examine whether people have been affected by the bias as part of its inspection of the use of retrospective facial recognition by police forces and relevant law enforcement agencies. HMICFRS have begun scoping and planning for the inspection, which will begin before the end of March 2026. The inspection’s terms of reference will be published by HMICFRS.
A facial recognition match is only ever one piece of intelligence within a wider police investigation. Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the PND to be visually assessed by a trained user and an investigating officer. These safeguards have always been in place to minimise the risk that the wrong person in the PND is subject to investigation.