Written Question
Immigration: Fees and Charges
Thursday 26th March 2026

Asked by: Clive Lewis (Labour - Norwich South)

Question to the Home Office:

To ask the Secretary of State for the Home Department, whether an impact assessment into the policy paper on Home Office immigration and nationality fees, due to increase from 8 April 2026, has been conducted.

Answered by Mike Tapp - Parliamentary Under-Secretary (Home Office)

Where changes are made to fee legislation, Impact Assessments are produced which identify the potential impacts resulting from those changes.

The published Impact Assessment includes discussion of the impacts of the fees that are due to increase from 8 April 2026: https://www.legislation.gov.uk/ukia/2026/44/pdfs/ukia_20260044_en.pdf


Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 283 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 286 Noes - 163
Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 289 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 291 Noes - 158
Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 285 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 292 Noes - 162
Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 286 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 290 Noes - 163
Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 284 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 300 Noes - 149
Division Vote (Commons)
25 Mar 2026 - Victims and Courts Bill
Clive Lewis (Lab) voted Aye - in line with the party majority and in line with the House
One of 290 Labour Aye votes vs 0 Labour No votes
Vote Tally: Ayes - 295 Noes - 162
Written Question
Biometrics: Ethnic Groups
Wednesday 25th March 2026

Asked by: Clive Lewis (Labour - Norwich South)

Question to the Home Office:

To ask the Secretary of State for the Home Department, what assessment she has made of the adequacy of the safeguards in place to mitigate racial and other bias in the use of retrospective facial recognition technology.

Answered by Sarah Jones - Minister of State (Home Office)

The Home Secretary has commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition. The detail of the inspection and publication of the report are a matter for HMICFRS, but they will look at whether there have been or are likely to have been any wrongful arrests as a result of the use of retrospective facial recognition.

Additionally, the Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database (PND) to be visually assessed by a trained user and investigating officer. If the trained PND user or investigator decides a facial search image provides a potential match, this must be treated as intelligence rather than evidence and additional lines of enquiry must be undertaken before any action is taken. These safeguards have always been in place, even before the independent National Physical Laboratory (NPL) testing.

The Home Office does not issue guidance on setting algorithm thresholds. The National Police Chiefs’ Council and police forces consider the impact and equitability of facial recognition technology in line with their Public Sector Equality Duty. The threshold is set for all forces by a Chief Constable on behalf of the NPCC to balance the equitability of facial searching, and the operational imperative to find true matches where they are present on PND.

The Home Office takes the findings of the National Physical Laboratory (NPL) report very seriously and has already acted. The Police Reform White Paper included a commitment to invest £26m into the development and delivery of a national facial recognition system for policing using a new algorithm. The new facial recognition algorithm has been independently tested by the NPL and this showed that it can be used at settings with no statistically significant bias. The new service will be operationally tested by the police in the coming months and will be subject to evaluation to inform future decisions about rolling out the new system with the new algorithm.


Written Question
Biometrics: Police National Database
Wednesday 25th March 2026

Asked by: Clive Lewis (Labour - Norwich South)

Question to the Home Office:

To ask the Secretary of State for the Home Department, what guidance is in place relating to the thresholds at which retrospective facial recognition searches of the Police National Database may be operated.

Answered by Sarah Jones - Minister of State (Home Office)

The Home Secretary has commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition. The detail of the inspection and publication of the report are a matter for HMICFRS, but they will look at whether there have been or are likely to have been any wrongful arrests as a result of the use of retrospective facial recognition.

Additionally, the Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database (PND) to be visually assessed by a trained user and investigating officer. If the trained PND user or investigator decides a facial search image provides a potential match, this must be treated as intelligence rather than evidence and additional lines of enquiry must be undertaken before any action is taken. These safeguards have always been in place, even before the independent National Physical Laboratory (NPL) testing.

The Home Office does not issue guidance on setting algorithm thresholds. The National Police Chiefs’ Council and police forces consider the impact and equitability of facial recognition technology in line with their Public Sector Equality Duty. The threshold is set for all forces by a Chief Constable on behalf of the NPCC to balance the equitability of facial searching, and the operational imperative to find true matches where they are present on PND.

The Home Office takes the findings of the National Physical Laboratory (NPL) report very seriously and has already acted. The Police Reform White Paper included a commitment to invest £26m into the development and delivery of a national facial recognition system for policing using a new algorithm. The new facial recognition algorithm has been independently tested by the NPL and this showed that it can be used at settings with no statistically significant bias. The new service will be operationally tested by the police in the coming months and will be subject to evaluation to inform future decisions about rolling out the new system with the new algorithm.


Written Question
Police: Biometrics
Wednesday 25th March 2026

Asked by: Clive Lewis (Labour - Norwich South)

Question to the Home Office:

To ask the Secretary of State for the Home Department, when the Idemia facial recognition algorithm for Home Office strategic facial matching will be rolled out across police forces.

Answered by Sarah Jones - Minister of State (Home Office)

The Home Secretary has commissioned His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) to conduct an inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition. The detail of the inspection and publication of the report are a matter for HMICFRS, but they will look at whether there have been or are likely to have been any wrongful arrests as a result of the use of retrospective facial recognition.

Additionally, the Home Office is aware of the risk of bias in facial recognition algorithms and supports policing in managing that risk. Manual safeguards, embedded in police training, operational practice, and guidance, require all potential matches returned from the Police National Database (PND) to be visually assessed by a trained user and investigating officer. If the trained PND user or investigator decides a facial search image provides a potential match, this must be treated as intelligence rather than evidence and additional lines of enquiry must be undertaken before any action is taken. These safeguards have always been in place, even before the independent National Physical Laboratory (NPL) testing.

The Home Office does not issue guidance on setting algorithm thresholds. The National Police Chiefs’ Council and police forces consider the impact and equitability of facial recognition technology in line with their Public Sector Equality Duty. The threshold is set for all forces by a Chief Constable on behalf of the NPCC to balance the equitability of facial searching, and the operational imperative to find true matches where they are present on PND.

The Home Office takes the findings of the National Physical Laboratory (NPL) report very seriously and has already acted. The Police Reform White Paper included a commitment to invest £26m into the development and delivery of a national facial recognition system for policing using a new algorithm. The new facial recognition algorithm has been independently tested by the NPL and this showed that it can be used at settings with no statistically significant bias. The new service will be operationally tested by the police in the coming months and will be subject to evaluation to inform future decisions about rolling out the new system with the new algorithm.