Information covering 4th January 2026 to 13th February 2026
Note: This sample does not contain the most recent 2 weeks of information.
| Division Votes |
|---|

| Date | Division | Vote | Alignment | Non-affiliated votes | Tally |
|---|---|---|---|---|---|
| 21 Jan 2026 | Children’s Wellbeing and Schools Bill | Aye | Against a party majority and against the House | One of 3 Aye vs 3 No | Ayes 65, Noes 162 |
| 21 Jan 2026 | Children’s Wellbeing and Schools Bill | Aye | In line with the party majority and the House | One of 8 Aye vs 4 No | Ayes 261, Noes 150 |
| 28 Jan 2026 | Children’s Wellbeing and Schools Bill | No | Against a party majority and against the House | One of 2 No vs 6 Aye | Ayes 255, Noes 183 |
| 28 Jan 2026 | Children’s Wellbeing and Schools Bill | No | In line with the party majority and the House | One of 3 No vs 2 Aye | Ayes 67, Noes 191 |
| 5 Jan 2026 | Diego Garcia Military Base and British Indian Ocean Territory Bill | No | Against a party majority and against the House | One of 1 No vs 2 Aye | Ayes 131, Noes 127 |
| 5 Jan 2026 | Diego Garcia Military Base and British Indian Ocean Territory Bill | No | Against a party majority and against the House | One of 1 No vs 4 Aye | Ayes 194, Noes 130 |
| 6 Jan 2026 | Sentencing Bill | No | Against a party majority, in line with the House | One of 4 No vs 5 Aye | Ayes 180, Noes 219 |
| 6 Jan 2026 | Sentencing Bill | No | In line with the party majority and the House | One of 4 No vs 3 Aye | Ayes 134, Noes 185 |
| 10 Feb 2026 | Sustainable Aviation Fuel Bill | No | Against a party majority, in line with the House | One of 4 No vs 7 Aye | Ayes 186, Noes 251 |
| 10 Feb 2026 | Sustainable Aviation Fuel Bill | No | Against a party majority, in line with the House | One of 5 No vs 7 Aye | Ayes 188, Noes 258 |
| Speeches |
|---|

| Date | Debate | Contribution | Location |
|---|---|---|---|
| Wednesday 28th January 2026 | National Police Service (Home Office) | 1 speech (2 words) | Lords Chamber |
| Wednesday 21st January 2026 | Age of Criminal Responsibility (Ministry of Justice) | 1 speech (61 words) | Lords Chamber |
| Tuesday 20th January 2026 | Youth Unemployment (Department for Work and Pensions) | 1 speech (57 words) | Lords Chamber |
| Written Answers |
|---|

**Biometrics: Ethnic Groups**

Asked by: Baroness Uddin (Non-affiliated - Life peer), Thursday 12th February 2026. Question to the Home Office: To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what assessment they have made of any bias and inconsistency of application in the use of facial recognition assessments and algorithms for Black and Asian men and women.

Answered by: Lord Hanson of Flint, Minister of State (Home Office)

The algorithm used for retrospective facial recognition searches on the Police National Database (PND) has been independently tested by the National Physical Laboratory (NPL), which found that in a limited set of circumstances it was more likely to incorrectly include some demographic groups in its search results. At the settings used by police, the NPL also found that if a correct match was in the database, the algorithm found it in 99% of searches.

We take these findings very seriously. A new algorithm has been procured and independently tested, which can be used at settings with no statistically significant bias. It is due to be operationally tested in the coming months and will be subject to evaluation. Manual safeguards embedded in police training, operational practice and guidance have always required trained users and investigating officers to visually assess all potential matches. Training and guidance have been re-issued and promoted to remind them of these long-standing manual safeguards. The National Police Chiefs’ Council has also updated and published data protection and equality impact assessments.

Given the importance of this issue, the Home Secretary has asked HMICFRS, supported by the Forensic Science Regulator, to inspect police and relevant law enforcement agencies’ use of retrospective facial recognition, with work expected to begin before the end of March.

It is important to note that no decisions are made by the algorithm or solely on the basis of a possible match: matches are intelligence, which must be corroborated with other information, as with any other police investigation.

For live facial recognition, NPL testing found a 1 in 6,000 false alert rate on a watchlist containing 10,000 images. In practice, the police have reported that the false alert rate has been far better than this. The NPL also found no statistically significant performance differences by gender, age or ethnicity at the settings used by the police.

On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.

**Biometrics: Databases**

Asked by: Baroness Uddin (Non-affiliated - Life peer), Wednesday 11th February 2026. Question to the Home Office: To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they will take to ensure that data and information collected as a result of the increased use of facial recognition (1) remains in British jurisdiction, (2) is managed by the government, and (3) is not transferred to any third-party entities or nations.

Answered by: Lord Hanson of Flint, Minister of State (Home Office)

Custody images used for retrospective facial recognition searches are stored on the Police National Database. The data is held at a secure location in the UK. Police use of facial recognition is governed by data protection legislation, which requires that any processing of biometric data is lawful, fair, proportionate and subject to appropriate safeguards. Police forces act as the data controllers for facial recognition use and must manage data, including any international transfers, in line with data protection law and established policing standards.

On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.

**Biometrics: Ethnic Groups**

Asked by: Baroness Uddin (Non-affiliated - Life peer), Thursday 12th February 2026. Question to the Home Office: To ask His Majesty's Government, with regard to the statement by the Secretary of State for the Home Office on 26 January (HC Deb col 610), what steps they are taking to correct and define new large language models for facial recognition to ensure errors and potential racial bias are removed.

Answered by: Lord Hanson of Flint, Minister of State (Home Office)

Facial recognition algorithms provided by or procured with Home Office funding for police use are required to be independently tested for accuracy and bias. Independent testing is important because it helps determine the settings in which an algorithm can safely and fairly be used. Where potential bias or performance issues are identified, the Home Office works with policing partners to ensure their guidance, practices and oversight processes minimise any risks arising from use of the technology.

On 4 December last year, we launched a public consultation on when and how biometrics, facial recognition and similar technologies should be used, and what safeguards and oversight are needed. Following analysis of the responses, we will publish a formal government response in due course.