
Iqbal Mohamed

Independent - Dewsbury and Batley

6,934 (18.2%) majority - 2024 General Election

First elected: 4th July 2024



Division Voting information

During the current Parliament, Iqbal Mohamed has voted in 291 divisions, and never against the majority of their Party.
View All Iqbal Mohamed Division Votes

Debates during the 2024 Parliament

Speeches made during Parliamentary debates are recorded in Hansard. For ease of browsing we have grouped debates into individual, departmental and legislative categories.

Sparring Partners
Hamish Falconer (Labour)
Parliamentary Under-Secretary (Foreign, Commonwealth and Development Office)
(24 debate interactions)
David Lammy (Labour)
Deputy Prime Minister
(17 debate interactions)
Nusrat Ghani (Conservative)
(13 debate interactions)
View All Sparring Partners
Department Debates
Home Office
(30 debate contributions)
Department of Health and Social Care
(23 debate contributions)
View All Department Debates
Legislation Debates
Universal Credit Act 2025
(2,561 words contributed)
Football Governance Act 2025
(1,545 words contributed)
Mental Health Act 2025
(1,458 words contributed)
View All Legislation Debates
View all Iqbal Mohamed's debates

Dewsbury and Batley Petitions

e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.

If an e-petition reaches 10,000 signatures the Government will issue a written response.

If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).

Petition Debates Contributed

Fund mandatory offer of testing for Type 1 Diabetes in babies, toddlers, and young children as a routine part of medical assessments at the point of care.

We urge the UK Government to scrap plans to extend ILR from 5 to 10 years. We feel that legal migrants, especially care workers, followed the rules and built lives here under the 5-year promise. We think they support vital services and deserve fairness, not shifting rules.

The Government should keep the current 5-year route to Indefinite Leave to Remain (ILR) and restrict access to government benefits for new ILR holders.

Ban the sale of fireworks to the general public to minimise the harm caused to vulnerable people and animals. Defenceless animals can die from the distress caused by fireworks.

I believe that permitting unregulated use of fireworks is an act of wide-scale cruelty to animals.

We think each year, individuals suffer because of loud fireworks. We believe horses, dogs, cats, livestock and wildlife can be terrified by noisy fireworks and many people find them intolerable.

We call on the Government to extend free bus travel to all people over 60 years old in England outside London. We believe the current situation is unjust and we want equality for everyone over 60.

We want the Government to repeal the Online Safety Act.

Act to ensure delivery of fuel, food, aid, life-saving services etc. We think this shouldn't be dependent on, or conditional on, Israeli facilitation, as the Knesset voted against UNRWA access to Gaza. We think that if military delivery of aid, airdrops, peacekeepers etc. are needed, then all should be considered.

Support in education is a vital legal right of children with special educational needs and disabilities (SEND). We ask the government to commit to maintaining the existing law, so that vulnerable children with SEND can access education and achieve their potential.

We think the UK Government must ban all cages for laying hens as soon as possible.

We think it should also ban the use of all cages and crates for all farmed animals including:
• farrowing crates for sows
• individual calf pens
• cages for other birds, including partridges, pheasants and quail

In modern society, we believe more consideration needs to be given to animal welfare and how livestock is treated and culled.

We believe non-stun slaughter is barbaric and doesn't fit in with our culture and modern-day values and should be banned, as some EU nations have done.


Latest EDMs signed by Iqbal Mohamed

12th March 2026
Iqbal Mohamed signed this EDM on Monday 16th March 2026

Closure of Al-Aqsa Mosque during Ramadan

Tabled by: Imran Hussain (Labour - Bradford East)
That this House condemns the closure of Al-Aqsa Sanctuary in Jerusalem by Israeli authorities during the Muslim holy month of Ramadan; notes that this action infringes Palestinians’ right to freedom of worship, violates Israel’s obligations under international humanitarian law and UN resolutions, and breaches the longstanding status quo governing the …
30 signatures
(Most recent: 20 Mar 2026)
Signatures by party:
Labour: 18
Independent: 6
Green Party: 5
Scottish National Party: 1
9th March 2026
Iqbal Mohamed signed this EDM on Thursday 12th March 2026

Fipronil and Imidacloprid Pesticides

Tabled by: Rachael Maskell (Labour (Co-op) - York Central)
That this House expresses grave concern that fipronil and imidacloprid, pesticides banned for outdoor agricultural use, are still being widely used in domestic veterinary treatments for ticks and fleas in cats and dogs; recognises that the widespread use of these substances contributes significantly to freshwater pollution; highlights that these chemicals …
16 signatures
(Most recent: 16 Mar 2026)
Signatures by party:
Labour: 9
Green Party: 5
Liberal Democrat: 1
Independent: 1
View All Iqbal Mohamed's signed Early Day Motions

Commons initiatives

These initiatives were driven by Iqbal Mohamed, and are more likely to reflect personal policy preferences.

MPs who act as Ministers or Shadow Ministers are generally restricted from performing Commons initiatives other than Urgent Questions.


Iqbal Mohamed has not been granted any Urgent Questions

Iqbal Mohamed has not been granted any Adjournment Debates

Iqbal Mohamed has not introduced any legislation before Parliament

1 Bill co-sponsored by Iqbal Mohamed

Glaucoma Care (England) Bill 2024-26
Sponsor - Shockat Adam (Ind)


Latest 50 Written Questions

(View all written questions)
Written Questions can be tabled by MPs and Lords to request specific information on the work, policy and activities of a Government Department.
4th Mar 2026
To ask the Minister for the Cabinet Office, what the annual cost is of Government contracts for the licensing of (a) Zoom, (b) Microsoft, (c) Amazon and (d) Google computing platforms for the civil service.

Information on the annual cost of Government contracts for licensing across the Civil Service is not held centrally.

Chris Ward
Parliamentary Secretary (Cabinet Office)
4th Mar 2026
To ask the Minister for the Cabinet Office, what information his Department holds on whether Government Departments have sought advice from Labour Together on policy development.

This information is not held centrally.

Chris Ward
Parliamentary Secretary (Cabinet Office)
27th Feb 2026
To ask the Minister for the Cabinet Office, what was the purpose of the Prime Minister's visit to Palantir head offices in Washington DC in February 2025.

I refer the Hon Member to my answer of 10th March 2026, Official Report, PQ 112839.

Nick Thomas-Symonds
Paymaster General and Minister for the Cabinet Office
27th Feb 2026
To ask the Minister for the Cabinet Office, what discussions the Prime Minister had with the then-UK Ambassador to the US on visiting Palantir head offices in Washington DC in February 2025.

I refer the Hon Member to my answer of 10th March 2026, Official Report, PQ 112839.

Nick Thomas-Symonds
Paymaster General and Minister for the Cabinet Office
27th Feb 2026
To ask the Minister for the Cabinet Office, if his Department will take steps to (a) review all existing contracts with Palantir and (b) suspend any further engagement with the company until the investigations into Peter Mandelson are completed.

All contracts, for any firm, go through rigorous departmental processes and decision makers. Contracts are procured by Government departments in line with procurement law. This was the case with all contracts awarded to Palantir.

We utilise a range of suppliers based on operational requirements, value for money, and compliance with our security and legal obligations, with all suppliers subject to rigorous due diligence. There are robust processes in place to ensure government contracts are awarded fairly and transparently.

Chris Ward
Parliamentary Secretary (Cabinet Office)
30th Jan 2026
To ask the Secretary of State for Business and Trade, how many items of protective body armour his Department has supplied for use by journalists operating in Gaza since October 2023.

The Department for Business and Trade does not supply body armour, and the export of body armour for personal protection when accompanying its user (for their own use) is not subject to export control.

Nonetheless the Department has approved 12 licences for the export of protective body armour for use by news organisations in Israel or Palestine since October 2023. Of these, 9 relate to Media Open Individual Licences which allow export to a wide range of countries. Similar equipment has also been licensed for export for use by NGOs in the region.

The UK is appalled by the extremely high number of fatalities, arrests and detentions of media workers in the State of Palestine. We have called on all parties to fully uphold International Humanitarian Law and ensure protection of civilians including journalists.

Chris Bryant
Minister of State (Department for Business and Trade)
30th Jan 2026
To ask the Secretary of State for Business and Trade, what assessment he has made of the potential impact of the International Court of Justice Advisory Opinion on Israel and the Occupied Palestinian Territories on trade with Israel.

We respect the independence of the International Court of Justice and continue to consider the Court’s Advisory Opinion carefully, with the seriousness and rigour it deserves.

Chris Bryant
Minister of State (Department for Business and Trade)
21st Jul 2025
To ask the Secretary of State for Energy Security and Net Zero, what discussions he has had with Ofgem on reducing electricity and gas standing charges.

The Government knows that, for many consumers, too much of the burden of the bill is placed on standing charges. We are committed to lowering the cost of standing charges and are working constructively with Ofgem on this issue. Ofgem have conducted a broad public consultation to understand the views of consumers, receiving over 5,000 responses to their 2024 discussion paper. Since then, Ofgem have been continuing work in two areas.

Firstly, Ofgem have been working to ensure that domestic consumers can choose tariffs with low or no standing charges. Ofgem took a further step towards this goal on 24th July, announcing proposals to require suppliers to offer their customers low or no standing charge tariffs from early 2026. You can read about this here: https://www.ofgem.gov.uk/policy/standing-charges-energy-price-cap-variant-next-steps.

Secondly, Ofgem have been reviewing how ‘fixed’ costs, which tend to be funded through standing charges, should be recovered in the future energy system. This includes whether those fixed costs could be recovered in more progressive ways, and we are working closely with the regulator on this.

Miatta Fahnbulleh
Parliamentary Under-Secretary (Housing, Communities and Local Government)
4th Mar 2026
To ask the Secretary of State for Science, Innovation and Technology, whether her Department has made an assessment of the potential merits of creating a UK Government cloud computing system independent from technology multinationals' services.

The Government recognises the importance of a secure and resilient cloud infrastructure for the delivery of digital public services. As set out in the Roadmap for Modern Digital Government (2026), the government is developing a National Cloud Strategy. As part of this, the government will assess how to strengthen the security and resilience of UK cloud infrastructure and improve the cloud ecosystem.

Ian Murray
Minister of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, whether he will make an assessment of the potential merits of legislative powers of direction over AI developers in the event of a loss-of-control incident.

AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.

The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.

Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.

The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, whether his Department has undertaken scenario planning exercises for AI loss-of-control events.

AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.

The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.

Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.

The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what (a) short, (b) medium (c) and long-term actions he is taking to help anticipate and mitigate the potential risks of AI loss-of-control.

AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.

The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.

Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.

The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what mechanisms are in place to coordinate cross-government preparedness for AI loss-of-control scenarios.

AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.

The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.

Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.

The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, how her Department defines AI loss of control; and whether that definition is shared across Departments.

AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.

The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.

Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.

The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what protocols are in place to help ensure rapid information-sharing with AI companies during a national AI emergency.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what emergency powers the Government holds to direct private AI developers during a national security incident involving advanced AI systems.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what role the AI Safety Institute plays in national security preparedness for advanced AI systems.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what role the AI Security Institute plays in national security preparedness for advanced AI systems.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, whether he plans to publish an AI Security Strategy.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what steps he is taking to help improve transparency on departmental responsibility for AI risk.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what assessment she has made of the potential risks associated with advanced AI systems across government.

This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.

Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.

The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.

This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.

The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what assessment his Department has made of the adequacy of current risk modelling for frontier AI systems.

The AI Security Institute was established to deepen our understanding of frontier AI risks.

The Institute works with the national security community and government experts to ensure AI technology delivers on its potential for UK growth, while working with companies to assess and manage the potential risks this technology poses.

The Institute’s role is also to ensure AI risk evaluation and understanding is more scientifically rigorous and reliable.

Advancing the scientific field of AI safety will help the UK ensure it has the best evidence available to navigate the uncertain trajectories that advanced AI could take.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
20th Feb 2026
To ask the Secretary of State for Science, Innovation and Technology, what information her Department holds on the Artificial Intelligence Security Institute's assessment of xAI's Grok.

The AI Security Institute collaborates with leading AI developers to measure the capabilities of advanced AI and recommend risk mitigations, to ensure we stay ahead of possible AI impacts.

The Government does not give a running commentary on models being tested or which models we have been granted access to due to commercial and security sensitivities.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
29th Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, whether the Artificial Intelligence Security Institute assessed xAI’s Grok for harms prior to launch.

The AI Security Institute regularly tests models across leading labs. While we do not provide a running commentary on which models are tested, for commercial and security reasons, the Institute actively works with labs to improve safeguards where vulnerabilities have been identified.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
29th Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, whether the Artificial Intelligence Security Institute completed a risk assessment of xAI’s Grok code before it was released to the public.

The AI Security Institute regularly tests models across leading labs. While we do not provide a running commentary on which models are tested, for commercial and security reasons, the Institute actively works with labs to improve safeguards where vulnerabilities have been identified.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
19th Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, if he will make an assessment of the potential merits of bringing forward legislative proposals to require the mandatory testing of generative AI models to ensure they cannot produce child sexual abuse material.

The government is committed to tackling the creation of this atrocious material. Creating, possessing, or distributing child sexual abuse material (CSAM), including AI Generated CSAM, is illegal. The Online Safety Act requires services to proactively identify and remove this content.

We are taking further action in the Crime and Policing Bill to criminalise CSAM image generators, and to ensure AI developers can directly test for and address vulnerabilities in their models which enable the production of CSAM.

The Government is clear: no option is off the table when it comes to protecting the online safety of users in the UK, and we will not hesitate to act where evidence suggests that further action is necessary.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
19th Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, what steps she is taking to ensure AI tools are safe by design to prevent the creation of child sexual abuse material.

The government is committed to tackling the creation of this atrocious material. Creating, possessing, or distributing child sexual abuse material (CSAM), including AI Generated CSAM, is illegal. The Online Safety Act requires services to proactively identify and remove this content.

We are taking further action in the Crime and Policing Bill to criminalise CSAM image generators, and to ensure AI developers can directly test for and address vulnerabilities in their models which enable the production of CSAM.

The Government is clear: no option is off the table when it comes to protecting the online safety of users in the UK, and we will not hesitate to act where evidence suggests that further action is necessary.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
19th Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, when she plans to introduce regulations ensuring generative AI cannot be misused to create extreme sexual abuse material involving children.

The government is committed to tackling the creation of this atrocious material. Creating, possessing, or distributing child sexual abuse material (CSAM), including AI-generated CSAM, is illegal. The Online Safety Act requires services to proactively identify and remove this content.

We are taking further action in the Crime and Policing Bill to criminalise CSAM image generators, and to ensure AI developers can directly test for and address vulnerabilities in their models which enable the production of CSAM.

The Government is clear: no option is off the table when it comes to protecting the online safety of users in the UK, and we will not hesitate to act where evidence suggests that further action is necessary.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
2nd Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, whether the Secretary of State or Ministers in the Department have received representations from AI companies regarding the content or timing of the proposed AI Bill.

The Government engages with a wide range of stakeholders on its approach to regulating Artificial Intelligence, including AI companies, academics, and civil society groups.

Details of Ministerial meetings with external organisations are published in the quarterly transparency returns.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
2nd Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, what steps the Government is able to take to delay or prohibit the public release of a frontier AI model in instances when the UK AI Security Institute assesses that model as posing a serious risk of assisting users in developing chemical, biological, radiological, or nuclear weapons.

We are optimistic about how AI will transform the lives of British people for the better, but advanced AI could also lead to serious security risks.

The Government believes that AI should be regulated at the point of use, and takes a context-based approach. Sectoral laws give powers to take steps where there are serious risks - for example the Procurement Act 2023 can prevent risky suppliers (including those of AI) from being used in public sector contexts, whilst a range of legislation offers protections against high-risk chemical and biological incidents.

This approach is complemented by the work of the AI Security Institute, which works in partnership with AI labs to understand the capabilities and impacts of advanced AI, and develop and test risk mitigations.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
2nd Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, whether the Government has established thresholds for dangerous weapons-related capabilities in frontier AI models.

The Department for Science, Innovation and Technology (DSIT) has policy responsibility for promoting responsible AI innovation and uptake. Risks related to chemical, biological, radiological, and nuclear weapons (and other dangerous weapons), including defining thresholds for harm in these domains, are managed by a combination of the Home Office, Foreign, Commonwealth and Development Office, Cabinet Office, and the Ministry of Defence. DSIT does not set thresholds for dangerous capabilities in risk domains owned by other departments.

The AI Security Institute (AISI), as part of DSIT, focuses on researching emerging AI risks with serious security implications, such as the potential for AI to help users develop chemical and biological weapons. AISI works with a broad range of experts and leading AI companies to understand the capabilities of advanced AI and advise on technical mitigations. AISI’s research supports other government departments in taking evidence-based action to mitigate risks whilst ensuring AI delivers on its potential for growth. AISI’s Frontier AI Trends Report, published in December 2025, outlines how frontier AI risks are expected to develop in the future.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
2nd Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, whether the Government has established a defined threshold of dangerous capability in frontier AI models, including capabilities relating to chemical, biological, radiological, or nuclear weapons, which would trigger Government action.

The Department for Science, Innovation and Technology (DSIT) has policy responsibility for promoting responsible AI innovation and uptake. Risks related to chemical, biological, radiological, and nuclear weapons (and other dangerous weapons), including defining thresholds for harm in these domains, are managed by a combination of the Home Office, Foreign, Commonwealth and Development Office, Cabinet Office, and the Ministry of Defence. DSIT does not set thresholds for dangerous capabilities in risk domains owned by other departments.

The AI Security Institute (AISI), as part of DSIT, focuses on researching emerging AI risks with serious security implications, such as the potential for AI to help users develop chemical and biological weapons. AISI works with a broad range of experts and leading AI companies to understand the capabilities of advanced AI and advise on technical mitigations. AISI’s research supports other government departments in taking evidence-based action to mitigate risks whilst ensuring AI delivers on its potential for growth. AISI’s Frontier AI Trends Report, published in December 2025, outlines how frontier AI risks are expected to develop in the future.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
2nd Jan 2026
To ask the Secretary of State for Science, Innovation and Technology, following Google DeepMind's provision of pre-deployment access to the UK AI Security Institute for safety testing of Gemini 3, whether the Institute received equivalent pre-deployment access to the most recent frontier AI models developed by (a) OpenAI, (b) Anthropic, (c) xAI, and (d) Meta prior to their public release.

The Government does not give a running commentary on models being tested or which models we have been granted access to due to commercial and security sensitivities. Where possible, given these sensitivities, the AI Security Institute aims to publish results.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
16th Sep 2025
To ask the Secretary of State for Science, Innovation and Technology, whether his Department provides guidance to businesses on the potential impact of AI systems on employment.

We want to ensure that people have access to good, meaningful work. AI is already transforming workplaces, demanding new skills, and augmenting existing ones. Government is working to harness its benefits to boost growth, productivity, living standards, and worker wellbeing, while mitigating the risks.

The Department for Education published an analysis in 2023 outlining The impact of AI on UK jobs and training. We are currently considering our approach to updating this analysis.

Further to this, the Get Britain Working White Paper outlines how the Government will address labour market challenges and spread the opportunity and economic prosperity that AI presents to the British public. This includes launching Skills England to create a shared national plan to boost the nation’s skills, creating more good jobs through our modern Industrial Strategy, and strengthening employment rights through DBT’s Plan to Make Work Pay.

DSIT has also published guidance for businesses adopting AI, focusing on good-practice AI assurance when procuring and deploying AI systems. AI assurance can help manage risks and build trust, supporting businesses to assess and mitigate the potential impacts of AI adoption.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
11th Sep 2025
To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to introduce skills retraining and workforce support measures, in the context of the deployment of AI technologies in workplaces.

We want to ensure that people have access to good, meaningful work. AI will impact the labour market, and the Government is working to harness its benefits in terms of boosting growth, productivity, living standards, and worker wellbeing, while mitigating the risks. We are planning for varied outcomes and monitoring data to track and prepare for them. The Get Britain Working White Paper sets out how we will address key challenges, including giving people the skills to get those jobs, spreading opportunity, and fixing the foundations of our economy to seize AI’s potential.

The Government is supporting workforce readiness for AI through a range of initiatives. The new AI Skills Hub, developed by Innovate UK and PwC, provides streamlined access to digital training. This will support government priorities through tackling critical skills gaps and improving workforce readiness. We are also partnering with 11 major companies to train 7.5 million UK workers in essential AI skills by 2030.

Kanishka Narayan
Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)
10th Jul 2025
To ask the Secretary of State for Science, Innovation and Technology, whether the AI Security Institute will be given statutory powers to (a) carry out audits, (b) approve the training of powerful AI models and (c) shut down unsafe systems.

Artificial intelligence is the defining opportunity of our generation, and the Government is taking action to harness its economic benefits for UK citizens. As set out in the AI Opportunities Action Plan, we believe most AI systems should be regulated at the point of use, with our expert regulators best placed to do so. Departments are working proactively with regulators to provide clear strategic direction and support them on their AI capability needs. Through well-designed and implemented regulation, we can fuel fast, wide and safe development and adoption of AI.

10th Jul 2025
To ask the Secretary of State for Science, Innovation and Technology, whether he plans to give statutory powers to the AI Security Institute.

Artificial intelligence is the defining opportunity of our generation, and the Government is taking action to harness its economic benefits for UK citizens. As set out in the AI Opportunities Action Plan, we believe most AI systems should be regulated at the point of use, with our expert regulators best placed to do so. Departments are working proactively with regulators to provide clear strategic direction and support them on their AI capability needs. Through well-designed and implemented regulation, we can fuel fast, wide and safe development and adoption of AI.

21st Jul 2025
To ask the Secretary of State for Culture, Media and Sport, what steps she is taking to help support small and medium-sized charities, in the context of increased competition for limited grant funding.

This government recognises the vital role that charities play in providing crucial support to different groups and communities. The Civil Society Covenant sets out the terms of a new relationship between government and civil society, and is a clear statement that government sees civil society as an indispensable partner in building a better Britain.

DCMS is promoting the availability of funding for smaller charities in several ways. This includes delivery of a number of grant schemes, such as the Know Your Neighbourhood Fund and the £25.5 million Voluntary, Community, and Social Enterprise (VCSE) Energy Efficiency Scheme, which is supporting frontline organisations across England to improve their energy efficiency and sustainability.

Support for charities is also available through social investment which provides a range of tools – from grants to investments – to help charities and social enterprises grow their trading income, strengthen their resilience, and access financial support that works for them. The Dormant Assets Scheme Strategy, published in June 2025, announced that the Scheme is expected to release £440 million for England over 2024-28, with £87.5 million of this funding allocated towards social investment.

Stephanie Peacock
Parliamentary Under Secretary of State (Department for Culture, Media and Sport)
21st Jul 2025
To ask the Secretary of State for Culture, Media and Sport, what assessment she has made of the (a) long-term sustainability of the third sector and (b) its impact on health and social care services.

This government recognises the vital role that charitable organisations and community groups play in improving people’s health and wellbeing. These organisations, as well as the wider Voluntary, Community and Social Enterprise (VCSE) sector, are integral to the Government’s vision for national renewal and delivery of the five national missions.

DCMS supports VCSEs with their financial sustainability through a number of grant programmes, and supporting the growth of other sources of funding. The Government’s Social Enterprise Boost Fund is an up to £5.1 million package of funding to kickstart and accelerate social enterprise activity in four disadvantaged areas of England. We also provide support to charities through a range of tax reliefs and exemptions, with more than £6 billion in charitable reliefs provided to charities, Community Amateur Sports Clubs and their donors in 2023-24.

We also have the VCSE Health and Wellbeing Programme, which is a mechanism through which the Department of Health and Social Care, NHS England and UK Health Security Agency work together with VCSE organisations to drive transformation of health and care systems; promote equality; address health inequalities; and help people, families, and communities to achieve and maintain wellbeing. This will help the government to deliver on the Health Mission, and in particular the shift to prevention, through a cross-sector approach.

Stephanie Peacock
Parliamentary Under Secretary of State (Department for Culture, Media and Sport)
19th Mar 2025
To ask the Secretary of State for Culture, Media and Sport, if she will take steps to help fund the repair and reopening of Dewsbury Sports Centre.

The Government recognises the importance of ensuring public access to leisure facilities which are great spaces for people of all ages to stay fit and healthy, and which play an important role within communities.

The ongoing responsibility of providing access to public leisure facilities lies at local authority level. We share your ambition to ensure that young people in Dewsbury have the opportunity to benefit from quality sport and physical activity. The Government encourages local authorities to make investments which offer the right opportunities and facilities for the communities they serve, investing in sport and physical activity with a place-based approach, to meet the needs of individual communities.

We recognise that grassroots facilities are at the heart of communities up and down the country and are acting to support more people to get active wherever they live. On 21 March we announced £100 million funding to be delivered through the Multi-Sport Grassroots Facilities Programme, supporting high-quality, inclusive facilities across the UK.

Stephanie Peacock
Parliamentary Under Secretary of State (Department for Culture, Media and Sport)
19th Mar 2025
To ask the Secretary of State for Culture, Media and Sport, whether she plans to take steps to help fund the repair and reopening of Dewsbury Sports Centre.

The Government recognises the importance of ensuring public access to leisure facilities which are great spaces for people of all ages to stay fit and healthy, and which play an important role within communities.

The ongoing responsibility of providing access to public leisure facilities lies at local authority level. We share your ambition to ensure that young people in Dewsbury have the opportunity to benefit from quality sport and physical activity. The Government encourages local authorities to make investments which offer the right opportunities and facilities for the communities they serve, investing in sport and physical activity with a place-based approach, to meet the needs of individual communities.

We recognise that grassroots facilities are at the heart of communities up and down the country and are acting to support more people to get active wherever they live. On 21 March we announced £100 million funding to be delivered through the Multi-Sport Grassroots Facilities Programme, supporting high-quality, inclusive facilities across the UK.

Stephanie Peacock
Parliamentary Under Secretary of State (Department for Culture, Media and Sport)
22nd Jan 2026
To ask the Secretary of State for Education, what assessment her Department has made of the potential impact of the sale of the qualifications arm of City & Guilds on qualification fees, provision, workforce employment and other aspects of the further education sector.

Following the sale of City and Guilds Ltd, we understand that the organisation will continue to deliver qualifications within the further education sector and to work constructively with providers as usual. As the regulator of qualifications, Ofqual has responsibility for ensuring that recognised awarding organisations meet their obligations on qualifications quality and public confidence. We understand that Ofqual also monitors qualifications prices and publishes this data annually.

Josh MacAlister
Parliamentary Under-Secretary (Department for Education)
22nd Jan 2026
To ask the Secretary of State for Education, if she will review the policy of automatic off-rolling to ensure a formal review and hearing occurs before any decision is made.

This government is clear that off-rolling in any form is unacceptable, and we will continue to work closely with Ofsted to tackle it.

Pupils may leave a school roll for many reasons, including permanent exclusion, transfer to another school or change of circumstances. All schools are legally required to notify the local authority when a pupil’s name is removed from the admissions register.

The law is clear: a pupil’s name can only be deleted from the admission register on the grounds prescribed in Regulation 9 of the School Attendance (Pupil Registration) (England) Regulations 2024.

Olivia Bailey
Parliamentary Under-Secretary of State (Department for Education) (Equalities)
10th Jul 2025
To ask the Secretary of State for Education, whether she has had recent discussions with (a) SEND advocacy organisations and (b) special school leaders on (i) attendance, (ii) attainment and (iii) wellbeing for students with SEND who spend part of their education learning from home.

My right hon. Friend the Secretary of State for Education and I continue to engage with special educational needs and disabilities charities, stakeholders and parents and carers on a wide variety of issues, including through weekly engagement sessions via webinars, meetings and visits. We also conduct roundtables with charities and campaigners, the most recent of which was in June.

These engagements will carry on throughout the White Paper consultation period into the autumn and beyond.

9th Jul 2025
To ask the Secretary of State for Education, what steps her Department is taking to support structured partnerships between mainstream schools and specialist SEND education providers.

The government has committed to enhancing the capability of mainstream schools to better support pupils with special educational needs and disabilities (SEND).

We are encouraged by emerging examples of effective collaboration, where special schools are working in partnership with mainstream settings to share specialist expertise.

Through our Change Programme, we are currently piloting approaches whereby alternative provision settings provide outreach support to mainstream schools. The insights gained from these pilots will inform future policy development and help shape sustainable, effective partnerships between mainstream schools and specialist SEND providers.

7th May 2025
To ask the Secretary of State for Education, how many children are on the SEND waiting list in Dewsbury and Batley constituency.

The department collects information from local authorities on the number of requests for an education, health and care (EHC) needs assessment, the number of EHC needs assessments carried out and the number of EHC plans issued on a calendar year basis. The latest figures we hold relate to the 2023 calendar year. Information for the 2024 calendar year will be published on 26 June.

The number of requests for an EHC needs assessment, the number of EHC needs assessments and the number of EHC plans issued within the statutory timeframe of 20 weeks from the date of the request for EHC needs assessment is given for Kirklees local authority in the table available here: https://explore-education-statistics.service.gov.uk/data-tables/permalink/2a676326-624e-4d03-96c7-08dd85738b16.

5th Mar 2026
To ask the Secretary of State for Environment, Food and Rural Affairs, with reference to her Department's White Paper entitled A new vision for water, updated 19 February 2026, what assessment her Department has made of the potential impact of delaying fines to water companies on levels of compliance by water companies with the proposed regulatory regime.

Payment of fines is ultimately a matter for the regulator.

Emma Hardy
Parliamentary Under-Secretary (Department for Environment, Food and Rural Affairs)
5th Mar 2026
To ask the Secretary of State for Environment, Food and Rural Affairs, how much the proposed new water regulator will cost; who will pay these costs; and what assessment she has made of the potential for these costs to be passed onto consumers in the form of higher bills.

Across all our reforms the goal is to deliver our key outcomes – environment, customers, investability – in the most effective and efficient way possible to ensure lasting value.

Emma Hardy
Parliamentary Under-Secretary (Department for Environment, Food and Rural Affairs)