
Written Question
Media Literacy Task Force
Friday 1st March 2024

Asked by: Dan Jarvis (Labour - Barnsley Central)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, if her Department will publish the recent work of the Media Literacy Taskforce.

Answered by Saqib Bhatti - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Media Literacy Taskforce, a body of 17 media literacy experts drawn from the tech industry, civil society, the press sector and academia, was established in March 2022. DSIT consults the Taskforce on how to tackle the key challenges facing the media literacy landscape, in particular that of how to improve provision for citizens who are disengaged or lack access to support.

For example, the Taskforce played a key role in launching the Media Literacy Taskforce Fund, a grant scheme through which we awarded over £800,000 to four innovative media literacy projects delivered over the financial years 2022/23 and 2023/24. These projects seek to build the online safety and critical thinking skills of users from all age groups, empowering them to respond effectively to the threats posed by mis- and disinformation, along with other online harms. Taskforce members advised the government on which projects should be awarded funding, and then helped grant recipients to maximise the impact of their projects.

The Taskforce does not produce its own reports or other written materials for publication. However, the Government has committed to publishing annual Action Plans until the end of Financial Year 2024/2025, setting out initiatives to meet the Online Media Literacy Strategy’s ambition. All projects funded in relation to the Strategy are evaluated robustly, and the findings will be published on gov.uk, improving the effectiveness and efficiency of future media literacy initiatives and informing government policy.


Written Question
Local Broadcasting
Tuesday 27th February 2024

Asked by: Alex Norris (Labour (Co-op) - Nottingham North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Culture, Media and Sport, what assessment her Department has made of the adequacy of the viability of local media.

Answered by Julia Lopez - Minister of State (Department for Science, Innovation and Technology)

The government is committed to supporting local and regional newspapers as pillars of communities and local democracy. They play an essential role in holding power to account, keeping the public informed of local issues and providing reliable, high-quality information.

The government is disappointed to see that Meta is closing its Community News Project. We are working to support journalism and local newsrooms to ensure the sustainability of this vital industry.

We are introducing a new, pro-competition regime for digital markets. The regime, which aims to address the far-reaching power of the biggest tech firms, will help rebalance the relationship between publishers and the online platforms on which they increasingly rely. This will make an important contribution to the sustainability of the press.

Additionally, our support for the sector has included the delivery of a £2 million Future News Fund; the extension of a 2017 business rates relief on local newspaper office space until 2025; and the publication of the Online Media Literacy Strategy. The BBC also supports the sector directly through the £8m it spends each year on the Local News Partnership, including the Local Democracy Reporting Scheme.


Written Question
Local Broadcasting
Tuesday 27th February 2024

Asked by: Alex Norris (Labour (Co-op) - Nottingham North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Culture, Media and Sport, what support her Department provides to local media.

Answered by Julia Lopez - Minister of State (Department for Science, Innovation and Technology)

The government is committed to supporting local and regional newspapers as pillars of communities and local democracy. They play an essential role in holding power to account, keeping the public informed of local issues and providing reliable, high-quality information.

The government is disappointed to see that Meta is closing its Community News Project. We are working to support journalism and local newsrooms to ensure the sustainability of this vital industry.

We are introducing a new, pro-competition regime for digital markets. The regime, which aims to address the far-reaching power of the biggest tech firms, will help rebalance the relationship between publishers and the online platforms on which they increasingly rely. This will make an important contribution to the sustainability of the press.

Additionally, our support for the sector has included the delivery of a £2 million Future News Fund; the extension of a 2017 business rates relief on local newspaper office space until 2025; and the publication of the Online Media Literacy Strategy. The BBC also supports the sector directly through the £8m it spends each year on the Local News Partnership, including the Local Democracy Reporting Scheme.


Written Question
Community News Project: Finance
Tuesday 27th February 2024

Asked by: Alex Norris (Labour (Co-op) - Nottingham North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Culture, Media and Sport, whether she has made an assessment of the potential impact of Meta ending funding for the Community News Project on local journalism.

Answered by Julia Lopez - Minister of State (Department for Science, Innovation and Technology)

The government is committed to supporting local and regional newspapers as pillars of communities and local democracy. They play an essential role in holding power to account, keeping the public informed of local issues and providing reliable, high-quality information.

The government is disappointed to see that Meta is closing its Community News Project. We are working to support journalism and local newsrooms to ensure the sustainability of this vital industry.

We are introducing a new, pro-competition regime for digital markets. The regime, which aims to address the far-reaching power of the biggest tech firms, will help rebalance the relationship between publishers and the online platforms on which they increasingly rely. This will make an important contribution to the sustainability of the press.

Additionally, our support for the sector has included the delivery of a £2 million Future News Fund; the extension of a 2017 business rates relief on local newspaper office space until 2025; and the publication of the Online Media Literacy Strategy. The BBC also supports the sector directly through the £8m it spends each year on the Local News Partnership, including the Local Democracy Reporting Scheme.


Written Question
Community News Project: Finance
Tuesday 27th February 2024

Asked by: Alex Norris (Labour (Co-op) - Nottingham North)

Question to the Department for Digital, Culture, Media & Sport:

To ask the Secretary of State for Culture, Media and Sport, whether she has had discussions with Meta on funding for the Community News Project.

Answered by Julia Lopez - Minister of State (Department for Science, Innovation and Technology)

The government is committed to supporting local and regional newspapers as pillars of communities and local democracy. They play an essential role in holding power to account, keeping the public informed of local issues and providing reliable, high-quality information.

The government is disappointed to see that Meta is closing its Community News Project. We are working to support journalism and local newsrooms to ensure the sustainability of this vital industry.

We are introducing a new, pro-competition regime for digital markets. The regime, which aims to address the far-reaching power of the biggest tech firms, will help rebalance the relationship between publishers and the online platforms on which they increasingly rely. This will make an important contribution to the sustainability of the press.

Additionally, our support for the sector has included the delivery of a £2 million Future News Fund; the extension of a 2017 business rates relief on local newspaper office space until 2025; and the publication of the Online Media Literacy Strategy. The BBC also supports the sector directly through the £8m it spends each year on the Local News Partnership, including the Local Democracy Reporting Scheme.


Written Question
Social Media: Disinformation
Tuesday 13th February 2024

Asked by: Robert Buckland (Conservative - South Swindon)

Question to the Department for Science, Innovation & Technology:

To ask the Secretary of State for Science, Innovation and Technology, what steps her Department is taking to help tackle digital astroturfing on social media.

Answered by Saqib Bhatti - Parliamentary Under Secretary of State (Department for Science, Innovation and Technology)

The Government recognises the range of tactics which could be employed to spread mis- and disinformation and the threat that these can pose. DSIT’s National Security Online Information Team (NSOIT) analyses coordinated attempts to artificially manipulate the online information environment, working with a range of partners, including social media platforms, civil society groups, academia, and international partners, to tackle it.

Digital astroturfing, amongst other techniques sometimes used by state actors to interfere with UK society, will be captured by the Foreign Interference Offence. This has been added as a priority offence in the Online Safety Act and will capture a wide range of state-sponsored disinformation and state-backed operations. Companies will have a legal duty to take proactive action to prevent users from encountering material that amounts to an offence of Foreign Interference, which could include content linked to digital astroturfing, and minimise how long any such content is present on their services.

Under the Act, Ofcom’s Disinformation Advisory Committee is empowered to conduct research and build understanding on mis- and disinformation related issues, which may include the threats posed by digital astroturfing. In addition, Ofcom’s updated statutory duty to promote media literacy includes specific duties to raise the public’s awareness of how to keep themselves and others safe online, including by understanding the nature and impact of mis- and disinformation. This could include initiatives related to specific malicious tactics.


Written Question
Defending Democracy Taskforce: Disinformation
Thursday 8th February 2024

Asked by: Dan Jarvis (Labour - Barnsley Central)

Question to the Home Office:

To ask the Secretary of State for the Home Department, what steps the Defending Democracy Taskforce is taking to help reduce disinformation at the next general election.

Answered by Tom Tugendhat - Minister of State (Home Office) (Security)

The Government is committed to safeguarding the UK’s elections and already has established systems and processes in place to protect the democratic integrity of the UK.

DSIT is the lead department on artificial intelligence and is part of the Defending Democracy Taskforce which has a mandate to safeguard our democratic institutions and processes from the full range of threats. The Taskforce ensures we have a robust system in place to rapidly respond to any threats during election periods.

Furthermore, the Online Safety Act places new requirements on social media platforms to swiftly remove illegal misinformation and disinformation - including artificial intelligence-generated deepfakes - as soon as they become aware of it. The Act also updates Ofcom’s statutory media literacy duty to require it to take tangible steps to prioritise the public's awareness of and resilience to misinformation and disinformation online. This includes enabling users to establish the reliability, accuracy, and authenticity of content.

Finally, the threat to democracy from artificial intelligence was discussed at the AI Safety Summit in November 2023, reinforcing the Government’s commitment to international collaboration on this shared challenge.


Written Question
Artificial Intelligence: Disinformation
Tuesday 6th February 2024

Asked by: Andrew Rosindell (Conservative - Romford)

Question to the Home Office:

To ask the Secretary of State for the Home Department, how much his Department has spent from the public purse on tackling AI deepfake crimes in each of the last three years.

Answered by Tom Tugendhat - Minister of State (Home Office) (Security)

Generative artificial intelligence services have made it easier to produce convincing deepfake content and, whilst there are legitimate use cases, this is also impacting a range of crime types.

The Home Office is working closely with law enforcement, international partners, industry and across Government to address the risks associated with deepfakes. This includes reviewing the extent to which existing criminal law provides coverage of AI-enabled offending and harmful behaviour, including the production and distribution of deepfake material using generative AI. If the review suggests alterations to the criminal law are required to clarify its application to AI-generated synthetic and manipulated material, then amendments will be considered in the usual way.

The Online Safety Act places new requirements on social media platforms to swiftly remove illegal content - including artificial intelligence-generated deepfakes - as soon as they become aware of it. The Act also updates Ofcom’s statutory media literacy duty to require it to take tangible steps to prioritise the public's awareness of and resilience to misinformation and disinformation online. This includes enabling users to establish the reliability, accuracy, and authenticity of content.

We have no current plans to ban services which generate deepfakes; however, the Government has been clear that companies providing AI services should take steps to ensure safety and reduce the risks of misuse. This was discussed at the Government’s AI Safety Summit in November 2023, reinforcing our commitment to international collaboration on this shared challenge.

Crime is recorded on the basis of the underlying offence, not whether a deepfake was involved, and we are therefore unable to provide a figure for deepfake-enabled crimes.

We are unable to provide figures for departmental spending as this is captured according to crime type, or broader work on artificial intelligence, and not broken down into activities specific to deepfakes.


Written Question
Artificial Intelligence: Disinformation
Tuesday 6th February 2024

Asked by: Andrew Rosindell (Conservative - Romford)

Question to the Home Office:

To ask the Secretary of State for the Home Department, how many potential crimes involving AI deepfake programmes were reported in each of the last three years.

Answered by Tom Tugendhat - Minister of State (Home Office) (Security)

Generative artificial intelligence services have made it easier to produce convincing deepfake content and, whilst there are legitimate use cases, this is also impacting a range of crime types.

The Home Office is working closely with law enforcement, international partners, industry and across Government to address the risks associated with deepfakes. This includes reviewing the extent to which existing criminal law provides coverage of AI-enabled offending and harmful behaviour, including the production and distribution of deepfake material using generative AI. If the review suggests alterations to the criminal law are required to clarify its application to AI-generated synthetic and manipulated material, then amendments will be considered in the usual way.

The Online Safety Act places new requirements on social media platforms to swiftly remove illegal content - including artificial intelligence-generated deepfakes - as soon as they become aware of it. The Act also updates Ofcom’s statutory media literacy duty to require it to take tangible steps to prioritise the public's awareness of and resilience to misinformation and disinformation online. This includes enabling users to establish the reliability, accuracy, and authenticity of content.

We have no current plans to ban services which generate deepfakes; however, the Government has been clear that companies providing AI services should take steps to ensure safety and reduce the risks of misuse. This was discussed at the Government’s AI Safety Summit in November 2023, reinforcing our commitment to international collaboration on this shared challenge.

Crime is recorded on the basis of the underlying offence, not whether a deepfake was involved, and we are therefore unable to provide a figure for deepfake-enabled crimes.

We are unable to provide figures for departmental spending as this is captured according to crime type, or broader work on artificial intelligence, and not broken down into activities specific to deepfakes.


Written Question
Artificial Intelligence: Disinformation
Tuesday 6th February 2024

Asked by: Andrew Rosindell (Conservative - Romford)

Question to the Home Office:

To ask the Secretary of State for the Home Department, whether his Department is taking steps to help tackle the rise in artificial intelligence generated deepfake crime.

Answered by Tom Tugendhat - Minister of State (Home Office) (Security)

Generative artificial intelligence services have made it easier to produce convincing deepfake content and, whilst there are legitimate use cases, this is also impacting a range of crime types.

The Home Office is working closely with law enforcement, international partners, industry and across Government to address the risks associated with deepfakes. This includes reviewing the extent to which existing criminal law provides coverage of AI-enabled offending and harmful behaviour, including the production and distribution of deepfake material using generative AI. If the review suggests alterations to the criminal law are required to clarify its application to AI-generated synthetic and manipulated material, then amendments will be considered in the usual way.

The Online Safety Act places new requirements on social media platforms to swiftly remove illegal content - including artificial intelligence-generated deepfakes - as soon as they become aware of it. The Act also updates Ofcom’s statutory media literacy duty to require it to take tangible steps to prioritise the public's awareness of and resilience to misinformation and disinformation online. This includes enabling users to establish the reliability, accuracy, and authenticity of content.

We have no current plans to ban services which generate deepfakes; however, the Government has been clear that companies providing AI services should take steps to ensure safety and reduce the risks of misuse. This was discussed at the Government’s AI Safety Summit in November 2023, reinforcing our commitment to international collaboration on this shared challenge.

Crime is recorded on the basis of the underlying offence, not whether a deepfake was involved, and we are therefore unable to provide a figure for deepfake-enabled crimes.

We are unable to provide figures for departmental spending as this is captured according to crime type, or broader work on artificial intelligence, and not broken down into activities specific to deepfakes.