Driving innovation that will deliver improved public services, create new better-paid jobs and grow the economy.
Oral Answers to Questions is a regularly scheduled appearance where the Secretary of State and junior ministers answer questions from backbench MPs at the Dispatch Box.
Other Commons Chamber appearances can be: Westminster Hall debates, which are held in response to backbench MPs or e-petitions asking for a Minister to address a detailed issue.
Written Statements are made when a current event is not sufficiently significant to require an Oral Statement, but the House is required to be informed.
The Department for Science, Innovation & Technology does not have any Bills currently before Parliament.
A bill to make provision about access to customer data and business data; to make provision about services consisting of the use of information to ascertain and verify facts about individuals; to make provision about the recording and sharing, and keeping of registers, of information relating to apparatus in streets; to make provision about the keeping and maintenance of registers of births and deaths; to make provision for the regulation of the processing of information relating to identified or identifiable living individuals; to make provision about privacy and electronic communications; to establish the Information Commission; to make provision about information standards for health and social care; to make provision about the grant of smart meter communication licences; to make provision about the disclosure of information to improve public service delivery; to make provision about the retention of information by providers of internet services in connection with investigations into child deaths; to make provision about providing information for purposes related to the carrying out of independent research into online safety matters; to make provision about the retention of biometric data; to make provision about services for the provision of electronic signatures, electronic seals and other trust services; to make provision about the creation and solicitation of purported intimate images and for connected purposes.
This Bill received Royal Assent on 19 June 2025 and is now law.
e-Petitions are administered by Parliament and allow members of the public to express support for a particular issue.
If an e-petition reaches 10,000 signatures the Government will issue a written response.
If an e-petition reaches 100,000 signatures the petition becomes eligible for a Parliamentary debate (usually Monday 4.30pm in Westminster Hall).
We want the Government to repeal the Online Safety Act.
Introduce 16 as the minimum age for children to have social media
Gov Responded - 17 Dec 2024. Debated on - 24 Feb 2025. We believe social media companies should be banned from letting children under 16 create social media accounts.
The Government recognises that the safe, reliable and accountable use of artificial intelligence is important to maintaining public trust in public services.
Departments deploying AI systems are expected to consider risks and impacts throughout the system lifecycle, including during design, development, deployment and operation. This includes compliance with rules and regulations on safety, transparency, accountability and data protection.
The Government has published guidance to support this, including the Data and AI Ethics Framework, the AI Playbook for Government and the AI Knowledge Hub, which together provide advice on governance, risk management, testing and oversight.
In addition, the Department for Science, Innovation and Technology has published guidance on AI assurance, and a cross‑government AI Testing and Assurance Framework supports proportionate testing, evaluation and ongoing monitoring.
AI‑enabled services are also expected to meet the GOV.UK Service Standard, including demonstrating that they are safe, secure, reliable and well‑governed.
The Government prioritised the commencement of the Competition and Markets Authority’s (CMA) new powers in digital markets last year to boost competition and fairness in the digital tech sector. Although the CMA operates independently of Government, the Government gave a clear steer for the CMA to use these new powers collaboratively and proportionately.
In March, the CMA announced a package of actions to strengthen competition in business software and cloud services. This includes a Strategic Market Status investigation into Microsoft’s business software under the UK’s digital markets regime, alongside voluntary actions from Amazon and Microsoft that will improve interoperability, reduce data egress fees and make switching easier in cloud services. Taken together, these steps aim to address identified concerns and support a more competitive, resilient cloud market in the UK.
The ‘Childhood in the Age of AI’ summit will be attended by a diverse group from civil society, industry and government, as well as representatives of young people. It will address the impacts of AI on children and young people across a wide range of domains, such as education, wellbeing, development and safety. The discussions will not be restricted to any age group.
This work forms part of the Government’s work to hear directly from parents and young people across the UK through our National Conversation on children’s and young people’s wellbeing online.
Data centres are foundational infrastructure for a modern, competitive UK economy, enabling the digital services that underpin productivity across numerous sectors, from financial services and advanced manufacturing to public services and the creative industries. By enabling artificial intelligence, cloud computing and data-intensive services, data centres generate productivity gains across the wider economy and reinforce the UK’s attractiveness as a crucial destination for investment.
techUK has estimated that UK data centres contribute £4.7 billion in gross value added each year and support tens of thousands of high-quality jobs across construction, operations and specialist supply chains. Operational employment is generally highly skilled and well paid, with wider employment supported through demand for electrical engineering, cooling, digital infrastructure and maintenance services.
HMG’s AI Growth Zone programme will unlock significant private investment and secure compute to drive AI growth, supporting high‑value local jobs and skills. HMG will also invest up to £5 million per Growth Zone, working with local areas to design tailored schemes to realise local economic benefits and boost AI adoption in local communities.
The Government Office for Science commissioned an evidence review by an external academic group to synthesise the latest published literature on misinformation. It is a pre-registered study looking at the existing published evidence and is therefore not seeking direct contributions from organisations. The review will be published in due course. The findings from this desktop exercise will inform the Government’s thinking on identifying and tackling harmful misinformation.
All organisations processing personal data in the UK must comply with the UK’s data protection framework, including the UK GDPR, regardless of where they are headquartered. This includes requirements that apply when personal data is transferred overseas, and organisations must ensure that appropriate safeguards are in place where required.
The UK has world-leading investigation and enforcement capabilities to ensure that data is collected and handled responsibly and securely. The Information Commissioner’s Office has powers to investigate, issue fines and require corrective action where organisations fail to comply with the UK’s data protection framework, and individuals may seek redress if their data is misused.
As threats to UK data evolve, our response will be agile and proportionate. We actively monitor threats to UK data and will not hesitate to take further action if necessary to protect our national security.
AI has huge potential benefits, but can also bring new risks, including new opportunities for criminals. The OSA lists fraud as a priority offence and regulates AI-generated media in the same way as ‘real’ content, placing the same obligations on services to protect users.
The Online Safety Act (OSA) lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. This would include, where appropriate, the use of emerging technologies to stifle criminal abuse of networks. To support compliance, Ofcom issues Codes of Practice advising services on how to be compliant with their regulatory obligations. We expect these Codes to evolve over time to include new technologies.
The Department for Science, Innovation and Technology does not hold data relating to the number of fraudulent or scam adverts on social media or other regulated services.
There are mechanisms in the Online Safety Act that allow Ofcom to collect information from categorised services on the incidence and dissemination of illegal content, which would include fraudulent advertising content. Ofcom is required under the Act to publish annual transparency reports.
The Online Safety Act (OSA) lists certain fraud offences as ‘priority offences’, meaning regulated services must prevent users encountering fraudulent content, swiftly remove it if it appears, and mitigate and manage the risk of their services facilitating fraud. Ofcom, the independent regulator, has robust powers to act where services are failing in these responsibilities.
Measures under the OSA to specifically tackle fraudulent advertising are still being implemented. In the summer, Ofcom aims to publish a register of categorised services and to launch a consultation on additional duties for those designated as Category 1 or 2A to tackle paid-for fraudulent advertising.
The Government recognises that AI is transforming workplaces, demanding new skills and augmenting existing roles. We have launched the AI and the Future of Work Unit - a cross‑government function dedicated to ensuring AI delivers positive outcomes for the economy, jobs and workers. We are preparing for a range of possible futures to ensure this transformation boosts productivity and opportunity, and the Government launched an assessment of AI impacts on the labour market in January 2026.
To build a digitally skilled workforce that supports long-term economic growth, drives innovation and expands individual opportunity, we are supporting AI Skills Boost to upskill 10 million workers in AI skills by 2030. More than 1 million AI training courses have already been delivered to workers across the UK.
The Government recognises the importance of memory chips to our economy and critical sectors. We regularly engage with industry to monitor supply chain vulnerabilities and understand potential risks across all chip types. Given the global nature of semiconductor supply chains, the UK is working closely with international partners bilaterally and through multilateral fora, such as the G7 and OECD, to strengthen collective resilience, improve information‑sharing, and develop coordinated approaches to supply chain challenges.
The Government keeps the impacts of data protection legislation under review. As set out in the answer of 20 March 2026 to Question 120026, there is currently no definitive empirical study that isolates the specific, UK‑wide impact of the UK GDPR on productivity since its adoption.
The UK’s data protection framework has been updated through the Data (Use and Access) Act, which makes targeted changes to the UK GDPR and related legislation to make the regime clearer, more proportionate and better suited to supporting responsible data‑driven innovation, while maintaining high standards of protection for individuals. In this context, the Government’s focus is on evaluating the impacts of the UK’s data protection framework as it now operates, including the reforms introduced by the Data (Use and Access) Act.
We are committed to building the evidence base on how our data protection and wider data legislation affects businesses, consumers and the economy, including productivity, as part of our ongoing programme of monitoring and evaluation.
We are working with UKRI, universities, and other partners to ensure the safe and responsible adoption of AI tools while protecting research integrity.
Our AI for Science Strategy recognises that the integration of AI into research holds potential to be the single most impactful application of the technology, setting out 15 actions that will support UK researchers. These include the provision of compute through the AI Research Resource; delivery of training and upskilling in AI methods; the creation, curation and scaling of AI-ready datasets; developing access models for AI tools; developing autonomous lab infrastructure; and supporting research into the impacts of AI on the scientific process.
Additionally, the National Data Library will support the foundations for AI-enabled research by improving access to high-quality public sector data, alongside recently published guidance to help public bodies make datasets AI-ready.
The Government is committed to ensuring that any risks from the industry-led migration of the copper based Public Switched Telephone Network (PSTN) to Voice over Internet Protocol (VoIP) are mitigated for everyone across the UK, including rural communities. In 2024/25, there were over 2,600 major incidents on the PSTN, each affecting 500 or more customers.
In November 2024, the Government secured additional safeguards from the telecoms industry. These include the provision of free battery back-ups for vulnerable and landline-dependent customers to ensure access to emergency services for at least one hour during a power outage. Many communication providers have gone further, providing battery back-ups of 4-7 hours.
In March 2026, the Government and industry agreed a new Fixed Telecoms Charter to extend these safeguards to all future fixed telecoms modernisation programmes.
The Department for Science, Innovation and Technology’s first set of accounts covered 2023/24, when expenditure on special severance payments was £99,390. Expenditure in subsequent years can be found in the relevant annual report and accounts.
The Online Safety Act requires platforms to tackle illegal content and protect children from harmful content, including that which is hateful and abusive. For large user-to-user platforms, known as ‘Category 1’ services, it will also provide adult users with more protections from hate speech by offering them more choice over the types of content they engage with, allowing them to filter content from non-verified accounts, and holding platforms to account for their terms of service. Ofcom has robust powers to enforce these duties.
No assessment has been made of the consistency between the number of beagles licensed for use in scientific experiments approved by the Home Office between January and December 2025 and the Government's Replacing Animals in Science strategy. The Labour Manifesto commits to partnering with scientists, industry and civil society as we work towards phasing out animal testing. It is not yet possible to replace all animal use due to the complexity of biological systems and regulatory requirements for their use. Any work to phase out animal testing must be science-led, in lockstep with partners.
All organisations processing personal data in the UK must comply with the UK’s data protection framework.
The UK has strong safeguards to ensure that data is collected and handled responsibly and securely. Companies registered in the UK are subject to our legal framework and regulatory jurisdiction. Personal data transfers abroad are subject to a high level of legal protection. Failure to comply can result in enforcement action.
The Online Safety Act lists fraud as a priority offence, meaning that in-scope services must now prevent and minimise user-generated fraud content from appearing on their platforms, and swiftly remove it if it does.
Services designated by Ofcom as Category 1 and 2A (large user-to-user and large search services respectively) will have additional duties to tackle paid-for fraudulent advertising. Ofcom aims to publish its categorisation register, and to consult on the additional duties for categorised services, including on fraudulent advertising, around July 2026.
The Department for Science, Innovation and Technology (DSIT) has committed a record £58.5 billion investment in R&D over the next 4 years. This includes £38.6 billion allocated to UKRI. The overall Government spend on R&D over the next 4 years is £86 billion.
The Science and Technology Facilities Council (STFC) within UKRI has a flat budget across this period and is currently working with the sector to model different spending scenarios for its overall portfolio including in particle physics, astronomy and nuclear physics (PPAN). The impacts of different modelled scenarios across the broad and diverse range of STFC-funded facilities and programmes will be considered alongside feedback from the sector when taking final decisions. The current level of post-doctoral researchers and flow of PhD students will be maintained across the SR period.
DSIT has asked UKRI to ensure that its specific investment decisions are informed by meaningful engagement with the scientific research community and a robust assessment of potential consequences for the UK’s scientific capability, research institutions and international standing.
The Information Commissioner’s Office has seen the average time to resolve or close an FOI complaint fall over the past five years, from 134 days in 2021/22 to 76 days in 2025/26, despite cases increasing from 5,932 to 8,337 over the same period. The ICO now publishes this information monthly on its website.
The Government recognises the importance of safeguarding the UK’s research and innovation ecosystem, including the university spinout sector, from risks associated with foreign ownership, influence, or investment. The government will not hesitate to use our powers to protect national security wherever we identify concerns and we have a range of effective measures in place to do so.
The Government is actively protecting the UK’s research and spinout ecosystem from national security risks. The National Protective Security Authority (NPSA), working with the National Cyber Security Centre (NCSC), supports universities and spinouts through the Secure Innovation programme, providing advice on due diligence, investment screening and managing security risks. Targeted Secure Innovation Security Reviews further help early‑stage firms identify and mitigate vulnerabilities linked to foreign engagement.
The Government has powers under the National Security and Investment (NSI) Act 2021 to review and, where required, intervene in investments that may pose a risk to national security. The Government also monitors the market at all times to identify acquisitions of potential national security interest.
The Department for Science, Innovation and Technology has not made a formal assessment to date of the extent to which public procurement frameworks allow the NHS or the Ministry of Defence to support the development and adoption of UK produced AI.
However, the Government is actively looking at this through a cross-government ministerial working group bringing together DSIT, the Department of Health and Social Care and the Ministry of Defence, which is exploring how government works with innovative UK companies, including in the AI sector. Alongside this, the Government’s wider approach is to use public procurement to make the public sector a first customer for innovative technologies and a launchpad for scale-ups, supported by Cabinet Office-led social value reforms and work through the Commercial Innovation Hub.
As of 16 March 2026, the GOV.UK App has an estimated total of over 230,000 active users. Because analytics tracking captures only users who opt in, this estimate is higher than the number of users who have consented to tracking. To date, approximately 135,000 users have consented to analytics tracking, an average of around 23,000 consenting users per month.
While the Government has not set formal numerical targets for 2026–27, the strategic aim is to drive sustained growth by making the GOV.UK App the most convenient and trusted way for people to access government services. Growth is expected as new features and services are introduced, alongside improvements in personalisation and ongoing focus on user needs, in line with the Government Digital Service’s roadmap for modern digital government.
The Government is also committed to addressing digital exclusion. The GOV.UK App has been designed to be simple and accessible, informed by user research conducted during its public beta and in line with GOV.UK accessibility standards. Alongside this, the Government will continue to assess the digital skills support needed, including understanding barriers faced by digitally excluded groups and working with departments, local authorities and delivery partners to provide assisted digital support and signposting to digital skills training. Services will continue to be available through multiple channels, ensuring that those who are unable to use digital services can still access government support.
Matters regarding specific delivery and commercial plans for any private project are for the lead private sector investor to confirm. The Government engages regularly with the sector to support build-out.
CoreWeave's announced investments into the UK total £2.5 billion. CoreWeave has committed £1.5 billion towards the Lanarkshire AI Growth Zone in Scotland, deploying cutting-edge semiconductors at DataVita's data centre campus in Lanarkshire. The earlier £1 billion investment covered the opening of CoreWeave's UK office as its European headquarters, the creation of job opportunities across engineering, operations, and finance, and the deployment of AI computing infrastructure across two data centres in Crawley and London Docklands.
Large AI infrastructure investments are complex and take time to deliver; as government, we want to encourage these investments by supporting them as best we can. Where important investment announcements and commitments are made, Government will continue to work closely with those companies to ensure the delivery of those investments.
The Government recognises that AI is transforming workplaces, demanding new skills and augmenting existing roles. We have launched the AI and the Future of Work Unit - a cross‑government function dedicated to ensuring AI delivers positive outcomes for the economy, jobs, and workers. We are preparing for a range of possible futures to ensure this transformation boosts productivity and opportunities and the Government launched an assessment of AI impacts on the labour markets in January 2026.
To build a digitally skilled workforce that supports long-term economic growth, drives innovation and expands individual opportunity, we are supporting AI Skills Boost to upskill 10 million workers in AI skills by 2030. More than 1 million AI training courses have already been delivered to workers across the UK.
Building on the Future of Work Unit, the Chancellor announced a new AI Economics Institute in her recent Mais Lecture. This joint HMT-DSIT institute will incorporate the FoW Unit, as part of a broader focus on the economics of AI, including labour market, productivity and other impacts.
The Government recognises the importance of the mathematical sciences. While delivery plans and funding allocations are prepared, the Engineering and Physical Sciences Research Council (EPSRC) which is part of UK Research and Innovation (UKRI), has made no additional commitments beyond existing planned investments, as set out in the response to HL14784.
The Digital Inclusion Innovation Fund is about testing new ideas, learning what works, and supporting the best approaches so they can grow and benefit more communities across the UK. The Fund received 1016 applications from organisations across the country, amounting to a total request of over £170m for the £11.9m available.
Payment-in-arrears is the standard Government approach for grants. However, we recognise that some stakeholders were concerned about payment-in-arrears and the short delivery window of the Fund. These issues are considerations we are taking forward as we continue policy development in this area.
Despite this, projects are continuing to deliver important outcomes for the people they support, such as supporting people to access the internet and building their digital skills.
We have appointed external evaluators who are working with grant recipients to understand the impact of the Fund. This will also involve assessing the process, including grant management and deliverability within the timescale.
We expect to receive their report in April 2026.
The GOV.UK app is in public beta with expenditure met from within the overall budgets of the Government Digital Service (GDS) as part of the wider GOV.UK modernisation activity.
In 2025/26, approximately £6.2 million has been attributed to the GOV.UK App and the related programme of personalisation and modernisation; this relates to spend on design, build, testing and running. There has been no significant spend on marketing the app, with less than £2,000 related to reaching private beta testing audiences.
We know that digital inclusion works best when it is delivered in local places by trusted people and organisations. The Digital Inclusion Innovation Fund is about backing local communities to close the digital divide, and grassroots organisations are fundamental to that process.
The Digital Inclusion Innovation Fund had 85 successful applications in England: a mix of charities, research organisations and local and combined authorities.
Around 73% of the organisations funded by the Digital Inclusion Innovation Fund are charities, many of which are local, grassroots voluntary organisations. The Department does not hold specific data on the annual income of these organisations.
The Digital Inclusion Innovation Fund was designed as a one-year programme to understand what works in digital inclusion, and how best practice or innovative approaches can be scaled to maximise local impact across the UK.
We remain committed to building a digitally inclusive society where no one is left behind, and plans for future support for digital inclusion are still in development.
His Majesty’s Government continues to take a careful and evidence-led approach to exploring the potential role of large language models in supporting departments to respond to enquiries from members of the public.
I refer the noble Lord to the answer I gave to question HL15270 on 18 March 2026.
No consultation on regulations to be made under section 154A of the Online Safety Act has yet been published.
The Department for Science, Innovation and Technology is continuing to work with Ofcom, UKRI, researchers, and service providers to design a framework to provide a means for researchers to access the invaluable data held by tech companies for the purposes of online safety research.
We will provide an update in due course.
The Online Safety Act lists fraud as priority illegal content, meaning in-scope services including social media and search providers must prevent and minimise fraudulent user-generated content from appearing on their services and swiftly remove it if it does. In-scope user-to-user services must also manage the risk that their service may be used to facilitate fraud offences.
Category 1 and 2A services (including large social media and search providers respectively) will have additional duties to tackle paid-for fraudulent advertising. Ofcom is responsible for designating categorised services and aims to publish the categorisation register in July.
Ministerial private offices within the Department for Science, Innovation and Technology are resourced flexibly to meet business needs, and the size of individual offices varies.
Staff are appointed across a range of grades from EO to SCS1.
Remuneration is in line with the Department’s published pay scales for each grade. Contracted working hours are typically 37 hours per week.
Staff turnover rates specific to ministerial private offices are not calculated.
The total number of staff currently working in ministerial private offices in the Department is 35.
An allowance of up to 18% of base salary is available to staff in private offices who meet the relevant eligibility criteria.
DSIT does not lead defence or security cooperation with Ukraine, which is driven by other government departments under the 100 Year Partnership. DSIT is supporting Ukrainian and UK researchers and businesses through UK Research and Innovation (UKRI) grants and Horizon Europe funding, which offer routes for scientific exchange.