Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what assessment they have made of any bias and inconsistency of application in the use of facial recognition assessments and algorithms for Black and Asian men and women.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
The algorithm used for retrospective facial recognition searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL), which found that in a limited set of circumstances it was more likely to incorrectly include some demographic groups in its search results. At the settings used by police, the NPL also found that if a correct match was in the database, the algorithm found it in 99% of searches.
We take these findings very seriously. A new algorithm has been procured and independently tested, which can be used at settings with no statistically significant bias. It is due to be operationally tested in the coming months and will be subject to evaluation.
Manual safeguards embedded in police training, operational practice and guidance have always required trained users and investigating officers to visually assess all potential matches. Training and guidance have been re-issued and promoted to remind them of these long-standing manual safeguards. The National Police Chiefs’ Council has also updated and published data protection and equality impact assessments.
Given the importance of this issue, the Home Secretary has asked HMICFRS, supported by the Forensic Science Regulator, to inspect police and relevant law enforcement agencies’ use of retrospective facial recognition, with work expected to begin before the end of March.
It is important to note that no decisions are made by the algorithm or solely on the basis of a possible match: matches are intelligence, which must be corroborated with other information, as with any other police investigation.
For live facial recognition, NPL testing found a 1 in 6,000 false alert rate on a watchlist containing 10,000 images. In practice, the police have reported that the false alert rate has been far better than this. The NPL also found no statistically significant performance differences by gender, age, or ethnicity at the settings used by the police.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they are taking to correct and define new large language models for facial recognition to ensure errors and potential racial bias are removed.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for accuracy and bias. Independent testing is important because it helps determine the setting in which an algorithm can safely and fairly be used.
Where potential bias or performance issues are identified, the Home Office works with policing partners to ensure their guidance, practices, and oversight processes minimise any risks arising from use of the technology.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Uddin (Non-affiliated - Life peer)
Question to the Home Office:
To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they will take to ensure that data and information collected as a result of the increased use of facial recognition (1) remains in British jurisdiction, (2) is managed by the government, and (3) is not transferred to any third party entities or nations.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
Custody images used for retrospective facial recognition searches are stored on the Police National Database. The data is held at a secure location in the UK.
Police use of facial recognition is governed by data protection legislation, which requires that any processing of biometric data is lawful, fair, proportionate and subject to appropriate safeguards.
Police forces act as the data controllers for facial recognition use and must manage data, including any international transfers, in line with data protection law and established policing standards.
On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.
Asked by: Baroness Maclean of Redditch (Conservative - Life peer)
Question to the Home Office:
To ask His Majesty's Government what consideration they have given to recording sexual offences committed by individuals who have Gender Recognition Certificates.
Answered by Lord Hanson of Flint - Minister of State (Home Office)
In England and Wales, crime recording is governed by the Home Office Counting Rules, which are victim-based: offences are recorded according to the crime in law that has been committed, such as rape, irrespective of who committed the offence.
With regard to the management of offenders, the Government considers that safeguarding is best served through strengthened management of registered sex offenders, regardless of whether or not they have acquired a Gender Recognition Certificate, including enhanced notification requirements, restrictions on changes to identity documents, and close police oversight of high-risk individuals. These measures include a requirement for registered sex offenders to notify the police of any changes to their personal information, such as a change of name. His Majesty's Passport Office monitors high-risk offenders to ensure they cannot obtain a new passport without police consultation. Failure to comply with requirements in this area is a criminal offence. These measures, provided for in the Crime and Policing Bill and existing legislation, ensure that the authorities have the information necessary to assess and manage risk, and we will continue to monitor these arrangements to ensure they safeguard the public.
Asked by: James Cleverly (Conservative - Braintree)
Question to the Home Office:
To ask the Secretary of State for the Home Department, pursuant to the Answer of 21 January 2026, to Question 105789, on Ministers and Public Consultation: Evidence, whether the Muslim Council of Britain is on the list of organisations subject to the policy of non-engagement.
Answered by Dan Jarvis - Minister of State (Cabinet Office)
The Home Office does not comment on specific groups.
It is up to each department to carry out due diligence when choosing to engage with any organisation or individual and, if asked, we will advise and share information to help others inform their decisions.
Asked by: Max Wilkinson (Liberal Democrat - Cheltenham)
Question to the Home Office:
To ask the Secretary of State for the Home Department, what number of people have been arrested as a result of mistaken identity due to Live Facial Recognition in the last year.
Answered by Sarah Jones - Minister of State (Home Office)
The Home Office is not aware of anyone being arrested as a result of mistaken identity due to live facial recognition in the last year. Forces also publish information about their deployments on their websites. More details on LFR deployments can be found in the Metropolitan Police report, Live Facial Recognition Annual Report September 2025.
Police use of live facial recognition is subject to safeguards that are designed to minimise the risk of misidentifications. These are set out in the Authorised Professional Practice guidance by the College of Policing (Live facial recognition | College of Policing). Forces must also comply with data protection, equality, and human rights laws and are subject to oversight by the Information Commissioner and the Equality and Human Rights Commission.
Following a possible live facial recognition alert, it is always a police officer on the ground who will decide what action, if any, to take. Facial recognition technology is not automated decision making – police officers and trained operators will always make the decisions about whether and how to use any suggested matches. This means that the technology is not the deciding factor on any arrest.
In November we launched a public consultation, ending on 12 February, to help shape a new framework on biometrics, facial recognition and similar technologies.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Home Office:
To ask the Secretary of State for the Home Department, what assessment her Department has made of the potential merits of compensation schemes for people wrongly identified by live facial recognition technology used by the police.
Answered by Sarah Jones - Minister of State (Home Office)
The Home Office has not assessed the potential merits of a specific compensation scheme for people wrongly identified by live facial recognition used by police.
The Home Office has not set a threshold for an acceptable proportion of misidentifications arising from police use of live facial recognition. However, police use of live facial recognition is subject to safeguards that are designed to minimise the risk of misidentifications. These are set out in the Authorised Professional Practice guidance by the College of Policing (Live facial recognition | College of Policing). Forces must also comply with data protection, equality, and human rights laws and are subject to oversight by the Information Commissioner and the Equality and Human Rights Commission.
Following a possible live facial recognition alert, it is always a police officer on the ground who will decide what action, if any, to take. Facial recognition technology is not automated decision making – police officers and trained operators will always make the decisions about whether and how to use any suggested matches.
In November we launched a public consultation, ending on 12 February, to help shape a new framework on biometrics, facial recognition and similar technologies. We want to hear views on when and how the technologies should be used, and what safeguards and oversight are needed. We are aware there have been concerns with the existing laws governing the use of facial recognition, and the consultation has been designed to explore these concerns by asking questions on additional safeguards around transparency, oversight and proportionality.
Asked by: David Davis (Conservative - Goole and Pocklington)
Question to the Home Office:
To ask the Secretary of State for the Home Department, whether her Department has set a threshold for an acceptable proportion of misidentifications arising from police use of live facial recognition.
Answered by Sarah Jones - Minister of State (Home Office)
The Home Office has not assessed the potential merits of a specific compensation scheme for people wrongly identified by live facial recognition used by police.
The Home Office has not set a threshold for an acceptable proportion of misidentifications arising from police use of live facial recognition. However, police use of live facial recognition is subject to safeguards that are designed to minimise the risk of misidentifications. These are set out in the Authorised Professional Practice guidance by the College of Policing (Live facial recognition | College of Policing). Forces must also comply with data protection, equality, and human rights laws and are subject to oversight by the Information Commissioner and the Equality and Human Rights Commission.
Following a possible live facial recognition alert, it is always a police officer on the ground who will decide what action, if any, to take. Facial recognition technology is not automated decision making – police officers and trained operators will always make the decisions about whether and how to use any suggested matches.
In November we launched a public consultation, ending on 12 February, to help shape a new framework on biometrics, facial recognition and similar technologies. We want to hear views on when and how the technologies should be used, and what safeguards and oversight are needed. We are aware there have been concerns with the existing laws governing the use of facial recognition, and the consultation has been designed to explore these concerns by asking questions on additional safeguards around transparency, oversight and proportionality.
Asked by: Andrew Ranger (Labour - Wrexham)
Question to the Home Office:
To ask the Secretary of State for the Home Department, whether her Department has made an assessment of the adequacy of a five-day response window for community consultation on proposals for large-scale asylum accommodation; and whether guidance will be revised to ensure adequate time is provided for local residents and stakeholders to respond.
Answered by Alex Norris - Minister of State (Home Office)
The Home Office remains committed to ensuring that any impact on local communities is kept to a minimum. Consultation with local authority officials forms a vital part of the procurement of asylum accommodation. The Home Office and its accommodation providers operate a robust consultation process, which not only ensures that local authorities are aware of all ongoing procurement activity for Dispersed Accommodation in their respective areas, but also allows them to share local expertise and intelligence at the earliest opportunity to inform procurement. However, to protect the safety and security of those being housed in Dispersal Accommodation (DA), we do not consult with local residents or publish details of DA addresses in the public domain.
Our accommodation providers ensure that consultation with local authorities is carried out in accordance with the requirements and standards set out in the Asylum Accommodation and Support Contracts. We work closely with statutory partners throughout the process to ensure effective coordination and oversight.
Asked by: Andrew Ranger (Labour - Wrexham)
Question to the Home Office:
To ask the Secretary of State for the Home Department, what oversight her Department has of consultation processes undertaken by private asylum accommodation providers when proposing new accommodation sites; and what minimum standards are required to ensure engagement with local communities.
Answered by Alex Norris - Minister of State (Home Office)
The Home Office remains committed to ensuring that any impact on local communities is kept to a minimum. Consultation with local authority officials forms a vital part of the procurement of asylum accommodation. The Home Office and its accommodation providers operate a robust consultation process, which not only ensures that local authorities are aware of all ongoing procurement activity for Dispersed Accommodation in their respective areas, but also allows them to share local expertise and intelligence at the earliest opportunity to inform procurement. However, to protect the safety and security of those being housed in Dispersal Accommodation (DA), we do not consult with local residents or publish details of DA addresses in the public domain.
Our accommodation providers ensure that consultation with local authorities is carried out in accordance with the requirements and standards set out in the Asylum Accommodation and Support Contracts. We work closely with statutory partners throughout the process to ensure effective coordination and oversight.