First elected: 4th July 2024
Speeches made during Parliamentary debates are recorded in Hansard. For ease of browsing we have grouped debates into individual, departmental and legislative categories.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
Funding so all infants are offered Type 1 Diabetes Testing in routine care
Gov Responded - 17 Jul 2025 | Debated on - 9 Mar 2026
Fund mandatory offer of testing for Type 1 Diabetes in babies, toddlers, and young children as a routine part of medical assessments at the point of care.
Protect Legal Migrants: do not implement the 10-Year ILR proposal
Gov Responded - 4 Dec 2025 | Debated on - 2 Feb 2026
We urge the UK Government to scrap plans to extend ILR from 5 to 10 years. We feel that legal migrants, especially care workers, followed the rules and built lives here under the 5-year promise. We think they support vital services and deserve fairness, not shifting rules.
Keep 5-Year ILR and Restrict Access to Benefits for New ILR Holders
Gov Responded - 4 Dec 2025 | Debated on - 2 Feb 2026
The Government should keep the current 5-year route to Indefinite Leave to Remain (ILR) and restrict access to government benefits for new ILR holders.
Limit the sale of fireworks to those running local council approved events only
Gov Responded - 18 Nov 2025 | Debated on - 19 Jan 2026
Ban the sale of fireworks to the general public to minimise the harm caused to vulnerable people and animals. Defenceless animals can die from the distress caused by fireworks.
I believe that permitting unregulated use of fireworks is an act of wide-scale cruelty to animals.
Reduce the maximum noise level for consumer fireworks from 120 to 90 decibels
Gov Responded - 7 Nov 2025 | Debated on - 19 Jan 2026
We think that each year individuals suffer because of loud fireworks. We believe horses, dogs, cats, livestock and wildlife can be terrified by noisy fireworks, and many people find them intolerable.
Extend free bus travel for people over 60 in England
Gov Responded - 12 Feb 2025 | Debated on - 5 Jan 2026
We call on the Government to extend free bus travel to all people over 60 years old in England outside London. We believe the current situation is unjust and we want equality for everyone over 60.
Repeal the Online Safety Act
Gov Responded - 28 Jul 2025 | Debated on - 15 Dec 2025
We want the Government to repeal the Online Safety Act.
Urgently fulfil humanitarian obligations to Gaza
Gov Responded - 8 Aug 2025 | Debated on - 24 Nov 2025
Act to ensure delivery of fuel, food, aid, life-saving services etc. We think this shouldn't be dependent on, or conditional upon, Israeli facilitation, as the Knesset voted against UNRWA access to Gaza. We think that if military delivery of aid, airdrops, peacekeepers etc. are needed, then all should be considered.
Retain legal right to assessment and support in education for children with SEND
Gov Responded - 5 Aug 2025 | Debated on - 15 Sep 2025
Support in education is a vital legal right of children with special educational needs and disabilities (SEND). We ask the government to commit to maintaining the existing law, so that vulnerable children with SEND can access education and achieve their potential.
End the use of cages and crates for all farmed animals
Gov Responded - 17 Feb 2025 | Debated on - 16 Jun 2025
We think the UK Government must ban all cages for laying hens as soon as possible.
We think it should also ban the use of all cages and crates for all farmed animals, including:
• farrowing crates for sows
• individual calf pens
• cages for other birds, including partridges, pheasants and quail
Ban non-stun slaughter in the UK
Gov Responded - 10 Jan 2025 | Debated on - 9 Jun 2025
In modern society, we believe more consideration needs to be given to animal welfare and how livestock is treated and culled.
We believe non-stun slaughter is barbaric and doesn't fit in with our culture and modern-day values and should be banned, as some EU nations have done.
These initiatives were driven by Iqbal Mohamed, and are more likely to reflect personal policy preferences.
MPs who act as Ministers or Shadow Ministers are generally restricted from performing Commons initiatives other than Urgent Questions.
Iqbal Mohamed has not been granted any Urgent Questions
Iqbal Mohamed has not been granted any Adjournment Debates
Iqbal Mohamed has not introduced any legislation before Parliament
Glaucoma Care (England) Bill 2024-26
Sponsor - Shockat Adam (Ind)
Information on the annual cost of Government contracts for licensing across the Civil Service is not held centrally.
I refer the Hon Member to my answer of 10th March 2026, Official Report, PQ 112839.
All contracts for any firm go through rigorous departmental processes and decision makers. Contracts are procured by Government departments in line with procurement law. This was the case with all contracts awarded to Palantir.
We utilise a range of suppliers based on operational requirements, value for money, and compliance with our security and legal obligations, with all suppliers subject to rigorous due diligence. There are robust processes in place to ensure government contracts are awarded fairly and transparently.
The Department for Business and Trade does not supply body armour, and the export of body armour for personal protection when accompanying its user (for their own use) is not subject to export control.
Nonetheless the Department has approved 12 licences for the export of protective body armour for use by news organisations in Israel or Palestine since October 2023. Of these, 9 relate to Media Open Individual Licences which allow export to a wide range of countries. Similar equipment has also been licensed for export for use by NGOs in the region.
The UK is appalled by the extremely high number of fatalities, arrests and detentions of media workers in the State of Palestine. We have called on all parties to fully uphold International Humanitarian Law and ensure protection of civilians including journalists.
We respect the independence of the International Court of Justice and continue to consider the Court’s Advisory Opinion carefully, with the seriousness and rigour it deserves.
The Government knows that, for many consumers, too much of the burden of the bill is placed on standing charges. We are committed to lowering the cost of standing charges and are working constructively with Ofgem on this issue. Ofgem have conducted a broad public consultation to understand the views of consumers on this issue, receiving over 5,000 responses on their 2024 discussion paper. Since then, Ofgem have been continuing work in two areas.
Firstly, Ofgem have been working to ensure that domestic consumers can choose tariffs with low or no standing charges. Ofgem took a further step towards this goal on 24th July, announcing proposals to require suppliers to offer their customers low or no standing charge tariffs from early 2026. You can read about this here: https://www.ofgem.gov.uk/policy/standing-charges-energy-price-cap-variant-next-steps.
Secondly, Ofgem have been reviewing how ‘fixed’ costs, which tend to be funded through standing charges, should be recovered in the future energy system. This includes whether those fixed costs could be recovered in more progressive ways, and we are working closely with the regulator on this.
The Government recognises the importance of a secure and resilient cloud infrastructure for the delivery of digital public services. As set out in the Roadmap for Modern Digital Government (2026), the government is developing a National Cloud Strategy. As part of this, the government will assess how to strengthen the security and resilience of UK cloud infrastructure and improve the cloud ecosystem.
AI models have the potential to pose novel risks by behaving in unintended or unforeseen ways. The possibility that this behaviour could lead to loss of control over advanced AI systems is taken seriously by many experts.
The AI Security Institute (AISI) is researching the development of AI capabilities that could contribute towards AI’s ability to evade human control, as well as the propensity of models to engage in misaligned actions.
Furthermore, through the Alignment Project – a funding consortium distributing up to £27m for research projects – AISI is supporting further foundational research into methods to develop AI systems that operate according to our goals, without unintended or harmful behaviours.
The Government has been clear that we will legislate on AI where needed but we will do so on the basis of evidence where any serious gaps exist.
This government is taking a long‑term, science‑led approach to understanding and preparing for emerging AI risks, including the possibility of very rapid progress with transformative impacts on society and national security.
Through close collaboration with industry and international allies, the government has deepened its understanding of risks, improved AI model security, and built UK resilience against threats.
The Government’s National Security Strategy sets out our intent to build the UK national security agenda for AI and other frontier technologies. This agenda will support the development of the UK's AI-enabled defence and security capabilities.
This is complemented by the work of the AI Security Institute (AISI), which focuses on emerging AI risks with serious security implications, including cyber misuse, chemical or biological risks, and autonomous AI capabilities.
The Government will remain vigilant and prepare for new AI risks, including rapid advancements that could affect society and national security.
The AI Security Institute was established to deepen our understanding of frontier AI risks.
The Institute works with the national security community and government experts to ensure AI technology delivers on its potential for UK growth, while working with companies to assess and manage the potential risks this technology poses.
The Institute’s role is also to ensure AI risk evaluation and understanding is more scientifically rigorous and reliable.
Advancing the scientific field of AI safety will help the UK ensure it has the best evidence available to navigate the uncertain trajectories that advanced AI could take.
The AI Security Institute collaborates with leading AI developers to measure the capabilities of advanced AI and recommend risk mitigations, to ensure we stay ahead of possible AI impacts.
The Government does not give a running commentary on models being tested or which models we have been granted access to due to commercial and security sensitivities.
The AI Security Institute regularly tests models across leading labs. While it does not provide a running commentary on which models it tests, for commercial and security reasons, it actively works with labs to improve safeguards when vulnerabilities have been identified.
The government is committed to tackling the creation of this atrocious material. Creating, possessing, or distributing child sexual abuse material (CSAM), including AI Generated CSAM, is illegal. The Online Safety Act requires services to proactively identify and remove this content.
We are taking further action in the Crime and Policing Bill to criminalise CSAM image generators, and to ensure AI developers can directly test for and address vulnerabilities in their models which enable the production of CSAM.
The Government is clear: no option is off the table when it comes to protecting the online safety of users in the UK, and we will not hesitate to act where evidence suggests that further action is necessary.
The Government engages with a wide range of stakeholders on its approach to regulating Artificial Intelligence, including AI companies, academics, and civil society groups.
Details of Ministerial meetings with external organisations are published in the quarterly transparency returns.
We are optimistic about how AI will transform the lives of British people for the better, but advanced AI could also lead to serious security risks.
The Government believes that AI should be regulated at the point of use, and takes a context-based approach. Sectoral laws give powers to take steps where there are serious risks - for example the Procurement Act 2023 can prevent risky suppliers (including those of AI) from being used in public sector contexts, whilst a range of legislation offers protections against high-risk chemical and biological incidents.
This approach is complemented by the work of the AI Security Institute, which works in partnership with AI labs to understand the capabilities and impacts of advanced AI, and develop and test risk mitigations.
The Department for Science, Innovation and Technology (DSIT) has policy responsibility for promoting responsible AI innovation and uptake. Risks related to chemical, biological, radiological, and nuclear weapons (and other dangerous weapons), including defining thresholds for harm in these domains, are managed by a combination of the Home Office, Foreign, Commonwealth and Development Office, Cabinet Office, and the Ministry of Defence. DSIT does not set thresholds for dangerous capabilities in risk domains owned by other departments.
The AI Security Institute (AISI), as part of DSIT, focuses on researching emerging AI risks with serious security implications, such as the potential for AI to help users develop chemical and biological weapons. AISI works with a broad range of experts and leading AI companies to understand the capabilities of advanced AI and advise on technical mitigations. AISI’s research supports other government departments in taking evidence-based action to mitigate risks whilst ensuring AI delivers on its potential for growth. AISI’s Frontier AI Trends Report, published in December 2025, outlines how frontier AI risks are expected to develop in the future.
The Government does not give a running commentary on models being tested or which models we have been granted access to due to commercial and security sensitivities. Where possible, given these sensitivities, the AI Security Institute aims to publish results.
We want to ensure that people have access to good, meaningful work. AI is already transforming workplaces, demanding new skills, and augmenting existing ones. Government is working to harness its benefits to boost growth, productivity, living standards, and worker wellbeing, while mitigating the risks.
The Department for Education published an analysis in 2023, ‘The impact of AI on UK jobs and training’. We are currently considering our approach to updating this analysis.
Further to this, the Get Britain Working White Paper outlines how government will address labour market challenges and spread the opportunity and economic prosperity that AI presents to the British public. This includes launching Skills England to create a shared national plan to boost the nation’s skills, creating more good jobs through our modern Industrial Strategy, and strengthening employment rights through DBT’s Plan to Make Work Pay.
DSIT has also published guidance for businesses adopting AI, focusing on good practice AI assurance when procuring and deploying AI systems. AI assurance can help manage risks and build trust, supporting businesses in assessing and mitigating the potential impacts of AI adoption.
We want to ensure that people have access to good, meaningful work. AI will impact the labour market, and Government is working to harness its benefits in terms of boosting growth, productivity, living standards, and worker wellbeing, while mitigating the risks. We are planning for varied outcomes and monitoring data to track and prepare for these. The Get Britain Working White Paper sets out how we will address key challenges, including giving people the skills to get good jobs, spreading opportunity, and fixing the foundations of our economy to seize AI’s potential.
The Government is supporting workforce readiness for AI through a range of initiatives. The new AI Skills Hub, developed by Innovate UK and PwC, provides streamlined access to digital training. This will support government priorities through tackling critical skills gaps and improving workforce readiness. We are also partnering with 11 major companies to train 7.5 million UK workers in essential AI skills by 2030.
Artificial intelligence is the defining opportunity of our generation, and the Government is taking action to harness its economic benefits for UK citizens. As set out in the AI Opportunities Action Plan, we believe most AI systems should be regulated at the point of use, with our expert regulators best placed to do so. Departments are working proactively with regulators to provide clear strategic direction and support them on their AI capability needs. Through well-designed and implemented regulation, we can fuel fast, wide and safe development and adoption of AI.
This government recognises the vital role that charities play in providing crucial support to different groups and communities. The Civil Society Covenant sets out the terms of a new relationship between government and civil society, and is a clear statement that government sees civil society as an indispensable partner in building a better Britain.
DCMS is promoting the availability of funding for smaller charities in several ways. This includes delivery of a number of grant schemes, such as the Know Your Neighbourhood Fund and the £25.5 million Voluntary, Community, and Social Enterprise (VCSE) Energy Efficiency Scheme, which is supporting frontline organisations across England to improve their energy efficiency and sustainability.
Support for charities is also available through social investment which provides a range of tools – from grants to investments – to help charities and social enterprises grow their trading income, strengthen their resilience, and access financial support that works for them. The Dormant Assets Scheme Strategy, published in June 2025, announced that the Scheme is expected to release £440 million for England over 2024-28, with £87.5 million of this funding allocated towards social investment.
This government recognises the vital role that charitable organisations and community groups play in improving people’s health and wellbeing. These organisations, as well as the wider Voluntary, Community and Social Enterprise (VCSE) sector, are integral to the Government’s vision for national renewal and delivery of the five national missions.
DCMS supports VCSEs with their financial sustainability through a number of grant programmes, and by supporting the growth of other sources of funding. The Government’s Social Enterprise Boost Fund is an up to £5.1 million package of funding to kickstart and accelerate social enterprise activity in four disadvantaged areas of England. We also provide support to charities through a range of tax reliefs and exemptions, with more than £6 billion in charitable reliefs provided to charities, Community Amateur Sports Clubs and their donors in 2023-24.
We also have the VCSE Health and Wellbeing Programme, which is a mechanism through which the Department of Health and Social Care, NHS England and UK Health Security Agency work together with VCSE organisations to drive transformation of health and care systems; promote equality; address health inequalities; and help people, families, and communities to achieve and maintain wellbeing. This will help the government to deliver on the Health Mission, and in particular the shift to prevention, through a cross-sector approach.
The Government recognises the importance of ensuring public access to leisure facilities which are great spaces for people of all ages to stay fit and healthy, and which play an important role within communities.
The ongoing responsibility of providing access to public leisure facilities lies at local authority level. We share your ambition to ensure that young people in Dewsbury can benefit from quality sport and physical activity opportunities. The Government encourages local authorities to make investments which offer the right opportunities and facilities for the communities they serve, investing in sport and physical activity with a place-based approach, to meet the needs of individual communities.
We recognise that grassroots facilities are at the heart of communities up and down the country and are acting to support more people to get active wherever they live. On 21 March we announced £100 million funding to be delivered through the Multi-Sport Grassroots Facilities Programme, supporting high-quality, inclusive facilities across the UK.
Following the sale of City and Guilds Ltd, we understand that the organisation will continue to deliver qualifications within the further education sector and work constructively with providers as usual. As the regulator of qualifications, Ofqual has responsibility for ensuring that recognised awarding organisations meet their obligations on qualifications quality and public confidence. We understand that Ofqual also monitors qualifications prices and publishes this data annually.
This government is clear that off-rolling in any form is unacceptable, and we will continue to work closely with Ofsted to tackle it.
Pupils may leave a school roll for many reasons, including permanent exclusion, transfer to another school or change of circumstances. All schools are legally required to notify the local authority when a pupil’s name is removed from the admissions register.
The law is clear: a pupil’s name can only be deleted from the admission register on the grounds prescribed in Regulation 9 of the School Attendance (Pupil Registration) (England) Regulations 2024.
My right hon. Friend the Secretary of State for Education and I continue to engage with special educational needs and disabilities charities, stakeholders and parents and carers on a wide variety of issues, including through weekly engagement sessions via webinars, meetings and visits. We also conduct roundtables with charities and campaigners, the most recent of which was in June.
These engagements will carry on throughout the White Paper consultation period into the autumn and beyond.
The government has committed to enhancing the capability of mainstream schools to better support pupils with special educational needs and disabilities (SEND).
We are encouraged by emerging examples of effective collaboration, where special schools are working in partnership with mainstream settings to share specialist expertise.
Through our Change Programme, we are currently piloting approaches whereby alternative provision settings provide outreach support to mainstream schools. The insights gained from these pilots will inform future policy development and help shape sustainable, effective partnerships between mainstream schools and specialist SEND providers.
The department collects information from local authorities on the number of requests for an education, health and care (EHC) needs assessment, the number of EHC needs assessments carried out and the number of EHC plans issued on a calendar year basis. The latest figures we hold relate to the 2023 calendar year. Information for the 2024 calendar year will be published on 26 June.
The number of requests for an EHC needs assessment, the number of EHC needs assessments and the number of EHC plans issued within the statutory timeframe of 20 weeks from the date of the request for EHC needs assessment is given for Kirklees local authority in the table available here: https://explore-education-statistics.service.gov.uk/data-tables/permalink/2a676326-624e-4d03-96c7-08dd85738b16.
Payment of fines is ultimately a matter for the regulator.
Across all our reforms, the goal is to deliver our key outcomes – environment, customers, investability – in the most effective and efficient way possible to ensure lasting value.