Online Safety Act 2023: Repeal

Debate between Ian Murray and Emily Darlington
Monday 15th December 2025


Westminster Hall


Ian Murray

I do not disagree with the hon. Lady. A whole host of issues around porn bots and AI-generated bots have now also sprung up. We are committed to the Online Safety Act and its review as it is being implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.

We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children and to examine online harms and developments in tackling them.

Emily Darlington

I wanted to raise with the Minister that the Science, Innovation and Technology Committee will be undertaking an inquiry in the new year on brain development, addictive use and how that impacts various key points in children’s development. The Minister says that he will look at all evidence. Will he look at the evidence produced by that inquiry to ensure that its information and advice goes to parents across this country?

--- Later in debate ---
Emily Darlington

I appreciate what the Minister says—that these powers are in legislation—yet the process is still the social media platforms marking their own homework. We are in a vicious circle: Ofcom will not take action unless it has a complaint based on evidence, but the evidence is not achievable because the algorithm is not made available for scrutiny. How should Ofcom use those powers more clearly ahead of the elections to ensure that such abuse to our democracy does not occur?

Ian Murray

A whole host of legislation sits behind this, including through the Electoral Commission and the Online Safety Act, but it is important for us to find ways to ensure that we protect our democratic processes, whether that be from algorithmic serving of content or foreign state actors. It is in the public domain that, when the Iranian servers went dark during the conflict with the US, a third of pro-independence Facebook pages in Scotland went dark, because they were being served by foreign state actors. We have seen that from Russia and various other foreign actors. We have to be clear that the regulations in place need to be implemented and, if they are not, we need to find other ways to ensure that we protect our democracy. At a small tangent, our public sector broadcasters and media companies are a key part of that.

To stay with my hon. Friend the Member for Milton Keynes Central (Emily Darlington), she made an excellent contribution, with figures for what is happening. She asked about end-to-end encryption. We support the responsible use of encryption, which is a vital part of our digital world, and the Online Safety Act does not ban any service design such as end-to-end encryption, nor does it require the creation of back doors. However, implementing end-to-end encryption in a way that intentionally blinds tech companies to content will have a disastrous impact on public safety, in particular for children, and we expect services to think carefully about their design choices and to make their services safe by design for children.

That leads me to online gaming platforms and Roblox, which my hon. Friend also mentioned. Ofcom has asked the main platforms, including Roblox, to share what they are doing and to make improvements where needed. Ofcom will take action if that is not advanced. A whole host of things are happening, and we need the Online Safety Act and the regulations underpinning it to take time to feed through. I hope that we will start to see significant improvements, as reflected on by my hon. Friend the Member for Sunderland Central.

My hon. Friend the Member for Milton Keynes Central mentioned deepfakes. That issue is important to our democracy as well. The Government are concerned about the proliferation of AI-enabled products and services that enable the creation of non-consensual deepfake images. In addition to criminalising the creation of non-consensual images, the Government are looking at further options, and we hope to provide an update on that shortly. It is key to protecting not only the wider public online but, fundamentally, those who seek public office.

The Government agree that a safer digital future needs to include small, personally owned and maintained websites. We recognise the important role that proportionate implementation of the Online Safety Act plays in supporting that aim. We can all agree that we need to protect children online, and we would not want low-risk services to face any unnecessary compliance burden. That is the balance we have to strike to keep the Act proportionate. The Government will conduct a post-implementation review of the Act and will consider the burdens on low-risk services as part of that review, as mentioned in the petition. We will also ensure that the Online Safety Act protects children and is nimble enough to deal with a very fast-moving tech world. I thank all hon. Members for providing a constructive debate and raising their issues. I look forward to engaging further in the months and years ahead.