Asked by: Wendy Chamberlain (Liberal Democrat - North East Fife)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of the use of animal testing in sepsis research.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
On 11 November 2025 the government published “Replacing animals in science: A strategy to support the development, validation and uptake of alternative methods”, which outlines the steps the government will take to replace the use of animals in science. The strategy is available on GOV.UK.
Sepsis is a complex and multifaceted condition, and its study presents significant scientific challenges. We will consider sepsis during the development of our areas of research interest list to determine the best path forward for new model development that drives scientific innovation, supports improved therapy development, and reduces reliance on animals.
Asked by: Baroness Ritchie of Downpatrick (Labour - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the economic benefits of having an increased number of data centres in the UK.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
Data centres are foundational infrastructure for a modern, competitive UK economy, enabling the digital services that underpin productivity across numerous sectors, from financial services and advanced manufacturing to public services and the creative industries. By enabling artificial intelligence, cloud computing and data-intensive services, data centres generate productivity gains across the wider economy and reinforce the UK’s attractiveness as a leading destination for investment.
techUK has estimated that UK data centres contribute £4.7 billion in gross value added each year and support tens of thousands of high-quality jobs across construction, operations and specialist supply chains. Operational employment is generally highly skilled and well paid, with wider employment supported through demand for electrical engineering, cooling, digital infrastructure and maintenance services.
HMG’s AI Growth Zone programme will unlock significant private investment and secure compute to drive AI growth, supporting high‑value local jobs and skills. HMG will also invest up to £5 million per Growth Zone, working with local areas to design tailored schemes to realise local economic benefits and boost AI adoption in local communities.
Asked by: Baroness Stowell of Beeston (Conservative - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the impact of enterprise software licensing practices on competition and customer choice in the UK cloud services market.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government prioritised the commencement of the Competition and Markets Authority’s (CMA) new powers in digital markets last year to boost competition and fairness in the digital tech sector. Although the CMA operates independently of Government, the Government gave a clear steer for the CMA to use these new powers collaboratively and proportionately.
In March, the CMA announced a package of actions to strengthen competition in business software and cloud services. This includes a Strategic Market Status investigation into Microsoft’s business software under the UK’s digital markets regime, alongside voluntary actions from Amazon and Microsoft that will improve interoperability, reduce data egress fees and make switching easier in cloud services.
Asked by: Baroness Penn (Conservative - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what plans they have for the Childhood in the Age of AI event on 20-22 April, including (1) who will be attending the summit; (2) what age ranges and topics it will address; and (3) whether it will include discussion of early years.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The ‘Childhood in the Age of AI’ summit will be attended by a diverse group of representatives from civil society, industry and government, as well as representatives of young people. It will address the impacts of AI on children and young people across a wide range of domains, such as education, wellbeing, development and safety. The discussions will not be restricted to any age group.
This forms part of the government’s work to hear directly from parents and young people across the UK through our National Conversation on children’s and young people’s wellbeing online.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the safety, reliability and accountability of AI systems deployed by public services; and what steps they are taking to ensure that appropriate safeguards, testing standards and oversight mechanisms are in place.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government recognises that the safe, reliable and accountable use of artificial intelligence is important to maintaining public trust in public services.
Departments deploying AI systems are expected to consider risks and impacts throughout the system lifecycle, including during design, development, deployment and operation. This includes compliance with rules and regulations on safety, transparency, accountability and data protection.
The Government has published guidance to support this, including the Data and AI Ethics Framework, the AI Playbook for Government and the AI Knowledge Hub, which together provide advice on governance, risk management, testing and oversight.
In addition, the Department for Science, Innovation and Technology has published guidance on AI assurance, and a cross‑government AI Testing and Assurance Framework supports proportionate testing, evaluation and ongoing monitoring.
AI‑enabled services are also expected to meet the GOV.UK Service Standard, including demonstrating that they are safe, secure, reliable and well‑governed.
Asked by: Baroness Stowell of Beeston (Conservative - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the implications for national resilience and economic security of high levels of concentration in the UK cloud infrastructure market.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government prioritised the commencement of the Competition and Markets Authority’s (CMA) new powers in digital markets last year to boost competition and fairness in the digital tech sector. Although the CMA operates independently of Government, the Government gave a clear steer for the CMA to use these new powers collaboratively and proportionately.
In March, the CMA announced a package of actions to strengthen competition in business software and cloud services. This includes a Strategic Market Status investigation into Microsoft’s business software under the UK’s digital markets regime, alongside voluntary actions from Amazon and Microsoft that will improve interoperability, reduce data egress fees and make switching easier in cloud services. Taken together, these steps aim to address identified concerns and support a more competitive, resilient cloud market in the UK.
Asked by: Joshua Reynolds (Liberal Democrat - Maidenhead)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment her Department has made of levels of competition in the UK cloud infrastructure market; and what implications that assessment has for investment in UK cloud and AI infrastructure.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government prioritised the commencement of the Competition and Markets Authority’s (CMA) new powers in digital markets last year to boost competition and fairness in the digital tech sector. Although the CMA operates independently of Government, the Government gave a clear steer for the CMA to use these new powers collaboratively and proportionately.
In March, the CMA announced a package of actions to strengthen competition in business software and cloud services. This includes a Strategic Market Status investigation into Microsoft’s business software under the UK’s digital markets regime, alongside voluntary actions from Amazon and Microsoft that will improve interoperability, reduce data egress fees and make switching easier in cloud services.
Asked by: James McMurdock (Independent - South Basildon and East Thurrock)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the risk of artificial intelligence increasing the scale and sophistication of online fraud.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
AI has huge potential benefits, but can also bring new risks, including new opportunities for criminals. The Online Safety Act (OSA) lists fraud as a priority offence and regulates AI-generated media in the same way as ‘real’ content, placing the same obligations on services to protect users.
The OSA lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users from encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. This would include, where appropriate, the use of emerging technologies to stifle criminal abuse of networks. To support compliance, Ofcom issues Codes of Practice advising services on how to comply with their regulatory obligations. We expect these Codes to evolve over time to cover new technologies.
Asked by: James McMurdock (Independent - South Basildon and East Thurrock)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to help ensure that online platforms deploy all available technologies to prevent fraud at scale.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
AI has huge potential benefits, but can also bring new risks, including new opportunities for criminals. The Online Safety Act (OSA) lists fraud as a priority offence and regulates AI-generated media in the same way as ‘real’ content, placing the same obligations on services to protect users.
The OSA lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users from encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. This would include, where appropriate, the use of emerging technologies to stifle criminal abuse of networks. To support compliance, Ofcom issues Codes of Practice advising services on how to comply with their regulatory obligations. We expect these Codes to evolve over time to cover new technologies.
Asked by: James McMurdock (Independent - South Basildon and East Thurrock)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to help ensure that online platforms deploy available technologies to prevent fraud at scale.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
AI has huge potential benefits, but can also bring new risks, including new opportunities for criminals. The Online Safety Act (OSA) lists fraud as a priority offence and regulates AI-generated media in the same way as ‘real’ content, placing the same obligations on services to protect users.
The OSA lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users from encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. This would include, where appropriate, the use of emerging technologies to stifle criminal abuse of networks. To support compliance, Ofcom issues Codes of Practice advising services on how to comply with their regulatory obligations. We expect these Codes to evolve over time to cover new technologies.