Biometrics: Ethnic Groups

(asked on 29th January 2026)

Question to the Home Office:

To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they are taking to correct and define new large language models for facial recognition to ensure errors and potential racial bias are removed.


Answered by
Lord Hanson of Flint
Minister of State (Home Office)
This question was answered on 12th February 2026

Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for accuracy and bias. Independent testing is important because it helps determine the setting in which an algorithm can safely and fairly be used.
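The answer does not describe the testing methodology itself. As a minimal illustrative sketch only, one standard way independent testers surface demographic bias in a face-verification algorithm is to compare its false match rate (FMR) and false non-match rate (FNMR) across groups at a fixed decision threshold; the data layout, group labels, and threshold below are all hypothetical, not drawn from any Home Office test.

```python
from collections import defaultdict

# Hypothetical trial records: (demographic_group, is_genuine_pair, similarity_score).
# A real evaluation would use large trial sets per group.
trials = [
    ("group_a", True, 0.91), ("group_a", False, 0.32),
    ("group_b", True, 0.58), ("group_b", False, 0.64),
]

THRESHOLD = 0.60  # illustrative operating point, chosen per deployment setting

def per_group_error_rates(trials, threshold):
    """Compute false match rate (FMR) and false non-match rate (FNMR)
    for each demographic group at a fixed decision threshold."""
    counts = defaultdict(lambda: {"genuine": 0, "fnm": 0, "impostor": 0, "fm": 0})
    for group, is_genuine, score in trials:
        c = counts[group]
        if is_genuine:
            c["genuine"] += 1
            if score < threshold:   # genuine pair wrongly rejected
                c["fnm"] += 1
        else:
            c["impostor"] += 1
            if score >= threshold:  # impostor pair wrongly accepted
                c["fm"] += 1
    return {
        g: {
            "FNMR": c["fnm"] / c["genuine"] if c["genuine"] else None,
            "FMR": c["fm"] / c["impostor"] if c["impostor"] else None,
        }
        for g, c in counts.items()
    }

print(per_group_error_rates(trials, THRESHOLD))
```

Comparing these per-group rates at the intended operating threshold is what lets an evaluator say whether an algorithm can "safely and fairly be used" in a given setting; large gaps between groups indicate the kind of bias the answer refers to.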

Where potential bias or performance issues are identified, the Home Office works with policing partners to ensure their guidance, practices, and oversight processes minimise any risks arising from use of the technology.

On 4 December 2025, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
