Asked by: Lord Wigley (Plaid Cymru - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government by what year 99 per cent of Wales will have 5G reception.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The rollout of 5G infrastructure is commercially driven and government does not hold data on where, or when, future rollout of mobile infrastructure will take place.
Government has a clear ambition for all populated areas to have higher quality 5G standalone connectivity by 2030. All three mobile network operators have committed significant investment across the UK working towards achieving this.
In Ofcom’s Connected Nations Annual Report 2025 (published November 2025), which shows coverage as of July 2025, 5G coverage is already present outside 91% of premises across Wales, and standalone 5G is available outside 59% of premises.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what steps they are taking to ensure robust governance, safety evaluation and transparency in their announced partnership with Google DeepMind, including the planned automated science laboratory and access to its AI models for public services.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The non‑binding Memorandum of Understanding between DSIT and Google DeepMind establishes a partnership for collaboration to support delivery on this government’s AI Opportunities Action Plan. This includes concrete initiatives such as priority access for UK scientists to AI tools; deepening collaboration with the AI Security Institute on AI safety and security research; and support for the development of AI-ready datasets in strategically important domains such as fusion energy.
The automated lab announced alongside the MoU is an independent Google DeepMind initiative, fully funded by Google DeepMind. The UK Government is not involved in operating or funding the lab.
The partnership with Google DeepMind will support DSIT’s efforts to explore how AI can improve productivity and service delivery across government. However, any use of AI in public services will be subject to the highest standards of safety and security, including the Data Protection Act 2018 and UK GDPR, the Government’s Data Ethics Framework, and relevant departmental assurance and security processes.
Asked by: Lord Taylor of Warwick (Non-affiliated - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government what assessment they have made of the risk that the use of AI tools by employers to research job candidates may introduce misinformation and increase the likelihood of unlawful discrimination in recruitment.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The Government is committed to ensuring the trusted and fair use of AI.
Through the AI Opportunities Action Plan, we committed to taking steps to drive responsible adoption of AI across sectors. This includes establishing the AI Assurance Innovation Fund. We are investing £11 million in the fund and convening a national consortium of expert stakeholders to support the quality and growth of the AI assurance market.
The Government has also published guidance on Responsible AI in Recruitment. This focuses on good practice for the procurement and deployment of AI systems for HR and recruitment. It identifies key questions, considerations, and assurance mechanisms that may be used to ensure the safe and trustworthy use of AI in recruitment.
Asked by: Daisy Cooper (Liberal Democrat - St Albans)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether her Department has made an assessment of how social media platforms could use in-built AI to detect and protect children against (a) cyberbullying and (b) online grooming.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The government takes tackling cyberbullying and online grooming extremely seriously.
Under the Online Safety Act, services must put in place measures to mitigate the risk of illegal activity, including grooming, and protect children from harmful content, such as bullying.
Ofcom recommends measures services can take to fulfil their duties in Codes of Practice, including using hash matching to detect and remove child sexual abuse material. Ofcom can introduce new measures in future iterations of the Codes.
On 18 December, the government published its Violence Against Women and Girls Strategy, including a world-leading ban on nudification apps. This government will not allow technology to be weaponised to humiliate and exploit women and girls.
Asked by: Jim McMahon (Labour (Co-op) - Oldham West, Chadderton and Royton)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what assessment has been made of the potential merits of introducing legislation to regulate designed-in bias in AI programmes such as ChatGPT, Grok, Copilot and others.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
A range of regulation and legislation, including data protection, equality legislation and sectoral regulation, applies to AI systems. Where AI systems contravene or are non-compliant with those rules, enforcement and mechanisms for redress will apply. The government is committed to supporting regulators to promote the responsible use of AI in their sectors, including identifying and addressing bias.
To further tackle this issue, the government ran the Fairness Innovation Challenge (FIC) with Innovate UK, the Equality and Human Rights Commission (EHRC), and the ICO. FIC supported the development of novel solutions to address bias and discrimination in AI systems and supported the EHRC and ICO to shape their own broader regulatory guidance.
Asked by: James McMurdock (Independent - South Basildon and East Thurrock)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, whether she has had discussions with social media companies on (a) adverts by unqualified operatives offering gas work and (b) the potential merits of implementing (i) pre‑advertising checks for Gas Safe accreditation and (ii) proactive takedowns of unsafe listings.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Secretary of State has had no discussions with social media companies on this matter.
The Gas Safety (Installation and Use) Regulations 1998 make it a criminal offence for anyone who is not on the Gas Safe Register to carry out gas work in domestic properties.
The Advertising Standards Authority requires all advertising to be legal and socially responsible. It is working with online platforms which have signed up to its Intermediary and Platform Principles to encourage compliance with the advertising codes online.
The Online Advertising Taskforce, chaired by the Minister for Creative Industries, Media and Arts, is also working to improve transparency and accountability in the online advertising supply chain.
Asked by: Tanmanjeet Singh Dhesi (Labour - Slough)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, who is the Chief Risk Officer for national security risks relating to the work of their Department.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government identifies and assesses risks to the nation through the internal, classified National Security Risk Assessment, and the external National Risk Register, the most recent version of which was published in August.
As set out in the UK Government Resilience Framework, each risk in the National Security Risk Assessment is owned and managed within Lead Government Departments.
Where those risks, including national security risks, relate to the work of the Department for Science, Innovation and Technology (DSIT), they are managed through the department’s risk management processes. Within DSIT, risks are regularly reported to the department’s senior leadership team, chaired by the Permanent Secretary, and then scrutinised by the Audit and Risk Assurance Committee (ARAC) on a regular basis.
Asked by: Mark Hendrick (Labour (Co-op) - Preston)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, what discussions her Department has had with AI companies on ensuring that AI chatbots do not promote or encourage self-harming behaviour.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
I meet regularly with civil society, industry and Ofcom to discuss online safety, including the risks of AI chatbots.
AI services allowing users to share content with one another or that search the live web are covered under the Online Safety Act and have a duty to protect users from illegal content, and children from harmful content.
To build on this, I have made encouraging self-harm a priority offence under the Act and in-scope chatbots will need to have measures in place to prevent users from encountering this content.
Asked by: Pippa Heylings (Liberal Democrat - South Cambridgeshire)
Question to the Department for Science, Innovation & Technology:
To ask the Secretary of State for Science, Innovation and Technology, with reference to her Department's policy paper entitled Replacing animals in science: A strategy to support the development, validation and uptake of alternative methods, published on 11 November 2025, what assessment she has made of the potential merits of taking legislative steps to set out the strategy’s priority areas for the targeted replacement of animal tests.
Answered by Kanishka Narayan - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
The Government’s new strategy sets out our long-term vision for a world where the use of animals in science is eliminated in all but exceptional circumstances, achieved by creating a research and innovation system that drives the development and validation of alternative methods to using animals in science. We will provide regular updates on strategy delivery including through a publicly available dashboard. Recognising that the legal framework in the UK already requires that animals are only ever used in science where there are no validated alternatives available, the government currently has no plans to legislate further on this matter.
Asked by: Lord Clement-Jones (Liberal Democrat - Life peer)
Question to the Department for Science, Innovation & Technology:
To ask His Majesty's Government when they plan to publish draft regulations under the powers granted in section 6A of the Privacy and Electronic Communications (EC Directive) Regulations 2003 to set out new exemptions to the prohibition on storing or accessing information in terminal equipment under section 6(1); and what is the extent of the involvement of the Information Commissioner's Office in drafting those regulations.
Answered by Baroness Lloyd of Effra - Baroness in Waiting (HM Household) (Whip)
The government is actively considering what exceptions could be made to regulation 6, and we shall update the House in due course.
Any regulations would be developed and drafted by the Department for Science, Innovation and Technology. The Information Commissioner’s Office (ICO) will publish recommendations for the government on this issue. The Government will consult the ICO and other interested stakeholders on the development of any regulations, as we are legally required to by the provisions in section 112(3) of the Data (Use and Access) Act 2025.