
Written Question
Biometrics: Private Sector
Tuesday 28th April 2026

Asked by: Lord Roberts of Llandudno (Liberal Democrat - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government, further to the Written Answer by Baroness Lloyd of Effra on 20 March (HL15283), what plans they have to further develop a legislative framework for the use of facial recognition software by private companies in the light of the increasing use of AI.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Government has no current plans to introduce a standalone legislative framework governing the use of facial recognition technology (FRT) by private companies. However, a recent consultation by the Home Office on a new legal framework for law enforcement use of biometrics and facial recognition will consider the relevance of any new developments in that area to wider public and private sector use of FRT. The consultation closed on 12 February, and responses are being analysed.

As noted in our previous correspondence, the use of FRT is already governed by a robust legal framework, including the UK GDPR and the Data Protection Act 2018. Under this framework, organisations must process data lawfully, fairly and transparently, and ensure its use is necessary and proportionate. Where used for identification, FRT involves biometric data, which is classified as special category personal data and is subject to stricter legal safeguards. Organisations must also carry out data protection impact assessments where use of such technologies is likely to pose high risks to individuals’ rights and freedoms.

The Government recognises that the use of artificial intelligence, including in FRT, continues to evolve. It therefore keeps the existing legislative framework under review, working closely with the Information Commissioner’s Office.


Written Question
Biometrics: Private Sector
Friday 20th March 2026

Asked by: Lord Roberts of Llandudno (Liberal Democrat - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what requirements are in place for private companies to inform their customers that facial recognition software is being used on the premises.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The use of facial recognition technologies is already governed by existing legal frameworks including equalities and data protection laws, which provide significant and proportionate protections. Under UK GDPR, there is a high bar for using such technology, as the processing of biometric data for identification purposes falls into the existing definition of special category data processing.

Under the UK’s data protection framework, organisations must process personal data fairly, lawfully, and transparently, which means being clear with people about how and why their personal data is being processed. Any personal data should also be kept secure and not processed for longer than is necessary. Organisations must also carry out an impact assessment when processing activities involving new technologies are likely to result in a high risk to individuals’ rights and freedoms.

The Information Commissioner’s Office (ICO), the independent data protection regulator, has issued guidance on the use of facial recognition systems and continues to monitor developments in this area.


Written Question
Biometrics: Private Sector
Friday 20th March 2026

Asked by: Lord Roberts of Llandudno (Liberal Democrat - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what plans they have to develop a legislative framework for the use of facial recognition software by private companies.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The use of facial recognition technologies is already governed by existing legal frameworks including equalities and data protection laws, which provide significant and proportionate protections. Under UK GDPR, there is a high bar for using such technology, as the processing of biometric data for identification purposes falls into the existing definition of special category data processing.

Under the UK’s data protection framework, organisations must process personal data fairly, lawfully, and transparently, which means being clear with people about how and why their personal data is being processed. Any personal data should also be kept secure and not processed for longer than is necessary. Organisations must also carry out an impact assessment when processing activities involving new technologies are likely to result in a high risk to individuals’ rights and freedoms.

The Information Commissioner’s Office (ICO), the independent data protection regulator, has issued guidance on the use of facial recognition systems and continues to monitor developments in this area.


Written Question
Biometrics: Private Sector
Friday 20th March 2026

Asked by: Lord Roberts of Llandudno (Liberal Democrat - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what privacy protections are in place around the use of facial recognition software by private companies.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The use of facial recognition technologies is already governed by existing legal frameworks including equalities and data protection laws, which provide significant and proportionate protections. Under UK GDPR, there is a high bar for using such technology, as the processing of biometric data for identification purposes falls into the existing definition of special category data processing.

Under the UK’s data protection framework, organisations must process personal data fairly, lawfully, and transparently, which means being clear with people about how and why their personal data is being processed. Any personal data should also be kept secure and not processed for longer than is necessary. Organisations must also carry out an impact assessment when processing activities involving new technologies are likely to result in a high risk to individuals’ rights and freedoms.

The Information Commissioner’s Office (ICO), the independent data protection regulator, has issued guidance on the use of facial recognition systems and continues to monitor developments in this area.


Written Question
Disinformation: Middle East
Wednesday 6th November 2024

Asked by: Lord Roberts of Llandudno (Liberal Democrat - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what steps they are taking to counter misinformation relating to the conflict in the Middle East.

Answered by Baroness Jones of Whitchurch

This department takes very seriously the threat that misinformation and disinformation related to the conflict in the Middle East can pose. We have taken a multi-faceted approach, working in lockstep with social media companies and with other government departments, including the Foreign Office.

Ministers have been clear that major social media platforms should remove illegal content, including hate speech, as well as content that breaches their terms of service. Platforms will be bound by these responsibilities when the Online Safety Act comes into force, and they should not wait for regulation to be in force before taking relevant action.