
Written Question
Public Sector: Data Protection
Monday 13th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what assessment they have made of the implications for data protection and governance of the involvement of private technology companies in the handling of sensitive data held by public authorities and regulators; and what steps they are taking to ensure that appropriate safeguards relating to data protection, accountability and transparency are in place.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Government is committed to ensuring that the involvement of private technology companies in the handling of sensitive data held by public authorities and regulators is subject to robust data protection, accountability, and transparency safeguards. All departments undertaking work involving personal data are required to conduct Data Protection Impact Assessments to ensure appropriate privacy, security, and fairness measures are in place. Where private‑sector tools, including algorithmic or AI‑enabled systems, are procured or used, departments must apply mandatory transparency standards and clearly document how such tools are embedded in decision‑making processes, their technical specifications, and relevant risk mitigations.

At a cross‑government level, the Government Digital Service (GDS), within the Department for Science, Innovation and Technology, is strengthening central coordination and oversight of data protection and privacy risks across government. This includes setting consistent standards, supporting departments on the responsible adoption of new technologies, and working closely with the Information Commissioner’s Office to raise data protection and information security standards across the public sector.

These measures are intended to ensure that the use of private technology companies supports innovation and improved public services, while maintaining high standards of data protection, accountability and public trust.


Written Question
Artificial Intelligence: Safety
Monday 13th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Home Office:

To ask His Majesty's Government what assessment they have made of the use of AI chatbot systems to facilitate stalking and harassment; and what steps they are taking to ensure that existing online safety, data protection and criminal law frameworks remain effective in addressing harms arising from the misuse of those technologies.

Answered by Lord Hanson of Flint - Minister of State (Home Office)

The Government continues to take steps to protect the UK public from crimes linked to the misuse of artificial intelligence (AI). This includes when AI is used to aid or facilitate stalking and harassment.

The Online Safety Act already regulates many generative AI services. However, the Government acknowledges that gaps remain, leading to inconsistent coverage of certain AI chatbot services.

We are addressing these gaps as a matter of urgency through an amendment to the Crime and Policing Bill. Through a new delegated power, we will be able to bring currently unregulated AI chatbots into the scope of the Online Safety Act. This will ensure they are subject to requirements to protect users from illegal content and activity.

We are also taking action on so-called ‘nudification’ tools, legislating through the Crime and Policing Bill to criminalise the development and supply of tools for generating non-consensual intimate images.

Beyond these measures, we will continue to work closely with law enforcement to tackle the harms presented by AI. The National Centre for VAWG (violence against women and girls) and Public Protection (NCVPP) continues to act as the subject matter expert on ongoing work relating to AI and VAWG in policing, to ensure that safeguarding is a core part of AI tools and models.


Written Question
Artificial Intelligence: Consumers
Monday 13th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what assessment they have made of the increasing deployment of generative AI systems in consumer-facing technologies such as voice assistants; and what steps they are taking to ensure that frameworks relating to data protection, consumer protection and product safety remain effective in the deployment of such technologies.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

We are committed to ensuring the UK is the leading adopter of AI in the G7, empowering British workers and businesses to seize its benefits by creating more rewarding jobs, increasing productivity and driving growth in our leading sectors.

AI assurance enables consumers to be confident that the products they buy will work as intended, which is why the Government is taking steps to build the AI assurance ecosystem that underpins safe deployment of AI, as set out in the Roadmap to Trusted Third-Party AI Assurance. This includes establishing the Centre for AI Measurement, led by the National Physical Laboratory, to accelerate the development of new, innovative AI assurance techniques.

The law also requires that all consumer products must be safe before they are placed on the market. The Office for Product Safety and Standards and local authority trading standards have enforcement powers across product safety regulations to take non-compliant or unsafe products off the UK market. The product safety framework will better respond to emerging risks posed by digital technologies, including AI-enabled and smart products, ensuring innovation does not come at the expense of consumer safety.


Written Question
Police: Biometrics
Monday 13th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Home Office:

To ask His Majesty's Government what assessment they have made of the use of facial recognition technologies by police forces and the implications of pausing deployment pending further study of potential racial bias; and what steps they are taking to ensure that such systems are subject to appropriate safeguards, oversight and standards to prevent discriminatory outcomes.

Answered by Lord Hanson of Flint - Minister of State (Home Office)

The Home Office works closely with police forces and stakeholders to assess the use of facial recognition by law enforcement. As part of this engagement, we have consulted on a new legal framework on how and when law enforcement should use biometrics and facial recognition, including the safeguards that should apply to the use of these technologies. That consultation closed on 12 February; we are considering responses and will legislate in due course.

When using the technology, the police must operate within the legal framework, including data protection, equality and human rights legislation, national guidance, a code of practice and force‑level policies. The Home Office is aware of the risk of bias in facial recognition algorithms and all police facial recognition systems funded by the Home Office must be independently tested so that they can be operated at settings where there is negligible bias.

The Home Secretary has also tasked His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS), with support from the Forensic Science Regulator, to look at whether people have been affected by bias as part of the inspection of police and relevant law enforcement agencies’ use of retrospective facial recognition. The inspection is in progress and the terms of reference have been published by HMICFRS.


Written Question
Artificial Intelligence: Internet
Thursday 9th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Business and Trade:

To ask His Majesty's Government what assessment they have made of the implications for competition and market access of the integration of generative AI tools into search engines; and what steps they are taking to ensure fair access for content providers and smaller firms in digital markets.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Competition and Markets Authority (CMA) is the UK’s independent competition authority and is responsible for operating the digital markets regime. It has designated Google with strategic market status in general search and search advertising services. Developments in generative AI were considered during the designation investigation. The CMA is now considering imposing conduct requirements to increase competition.


Written Question
Artificial Intelligence: Public Sector
Wednesday 8th April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what assessment they have made of the safety, reliability and accountability of AI systems deployed by public services; and what steps they are taking to ensure that appropriate safeguards, testing standards and oversight mechanisms are in place.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Government recognises that the safe, reliable and accountable use of artificial intelligence is important to maintaining public trust in public services.

Departments deploying AI systems are expected to consider risks and impacts throughout the system lifecycle, including during design, development, deployment and operation. This includes compliance with rules and regulations on safety, transparency, accountability and data protection.

The Government has published guidance to support this, including the Data and AI Ethics Framework, the AI Playbook for Government and the AI Knowledge Hub, which together provide advice on governance, risk management, testing and oversight.

In addition, the Department for Science, Innovation and Technology has published guidance on AI assurance, and a cross‑government AI Testing and Assurance Framework supports proportionate testing, evaluation and ongoing monitoring.

AI‑enabled services are also expected to meet the GOV.UK Service Standard, including demonstrating that they are safe, secure, reliable and well‑governed.


Written Question
Artificial Intelligence: Employment
Thursday 2nd April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what assessment they have made of the impact of AI tools on the UK’s outsourcing and contact-centre sector, including the use of AI-driven customer-service systems; and what implications this may have for employment patterns and skills demand in the sector.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The Government recognises that AI is transforming workplaces, demanding new skills and augmenting existing roles. We have launched the AI and the Future of Work Unit - a cross‑government function dedicated to ensuring AI delivers positive outcomes for the economy, jobs, and workers. We are preparing for a range of possible futures to ensure this transformation boosts productivity and opportunity, and the Government launched an assessment of the impact of AI on the labour market in January 2026.

To build a digitally skilled workforce that supports long-term economic growth, drives innovation and expands individual opportunity, we are supporting AI Skills Boost to upskill 10 million workers in AI skills by 2030. More than 1 million AI training courses have already been delivered to workers across the UK.


Written Question
Clothing: Manufacturing Industries
Thursday 2nd April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Business and Trade:

To ask His Majesty's Government what assessment they have made of the use of AI in the fashion industry to reduce unsold inventory and improve supply chain efficiency; and what support is available to retailers to adopt such technology to enhance productivity and sustainability.

Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)

The fashion industry is increasingly using AI to improve demand forecasting, reduce unsold stock and increase supply chain efficiency, thereby supporting productivity and sustainability. Businesses can access support to adopt AI through programmes such as Made Smarter and Innovate UK, alongside wider productivity, digital adoption and skills initiatives, helping businesses invest in technologies that improve efficiency while reducing waste and environmental impact.

The government supports responsible and ethical AI adoption across our world leading creative industries, enabling organisations and freelancers to improve productivity, reach new audiences and develop new products and services.


Written Question
Alcoholic Drinks and Drugs: Misuse
Wednesday 1st April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department of Health and Social Care:

To ask His Majesty's Government what assessment they have made of the role of digital technologies in supporting treatment and recovery services for people experiencing drug and alcohol addiction.

Answered by Baroness Merron - Parliamentary Under-Secretary (Department of Health and Social Care)

The Government is continuing to invest in improvements to local alcohol and drug treatment services to ensure those in need can access high quality help and support. From 2026, all drug and alcohol treatment and recovery funding will be channelled through the Public Health Grant, with over £13.45 billion allocated across three years, including £3.4 billion ringfenced for drug and alcohol treatment and recovery.

Local authorities are responsible for assessing local needs for alcohol and drug prevention and treatment in their area, and commissioning services to meet these needs. The Government works with local treatment systems to provide a number of digital products including guidance, subject-matter expertise and data tools to help them deliver their service.

Digital products are derived from The National Drug Treatment Monitoring System and other related health datasets and made available via a dedicated website to enable local treatment systems to monitor treatment access and better manage outcomes.


Written Question
Artificial Intelligence: Research
Wednesday 1st April 2026

Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)

Question to the Department for Science, Innovation & Technology:

To ask His Majesty's Government what steps they are taking to support UK researchers in the use of artificial intelligence, including measures to promote oversight and reproducibility.

Answered by Lord Vallance of Balham - Minister of State (Department for Energy Security and Net Zero)

We are working with UKRI, universities, and other partners to ensure the safe and responsible adoption of AI tools while protecting research integrity.

Our AI for Science Strategy recognises that the integration of AI into research holds potential to be the single most impactful application of the technology, setting out 15 actions that will support UK researchers. These include the provision of compute through the AI Research Resource; delivery of training and upskilling in AI methods; the creation, curation, and scaling of AI-ready datasets; developing access models for AI tools; developing autonomous lab infrastructure; and supporting research into the impacts of AI on the scientific process.

Additionally, the National Data Library will support the foundations for AI-enabled research by improving access to high-quality public sector data, alongside recently published guidance to help public bodies make datasets AI-ready.