Online Safety Act 2023: Repeal

Ian Murray Excerpts
Monday 15th December 2025


Westminster Hall


The Minister for Digital Government and Data (Ian Murray)

It is great to see you in the Chair, Sir John. I did not realise you were such a technophobe until we heard from the shadow Minister, the hon. Member for Hornchurch and Upminster (Julia Lopez). I am disappointed that you were not able to contribute to this debate. I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for moving the motion on behalf of the Petitions Committee, and I thank him and other speakers for their contributions.

I have not been on the RTG fans message board that my hon. Friend mentioned, but I am sure it has been very busy this weekend. I wondered if some of the trolls mentioned by the hon. Member for Bromley and Biggin Hill (Peter Fortune) were perhaps wearing black and white over the weekend. My hon. Friend the Member for Sunderland Central raised an important point, however: it is the site managers and volunteers who are hosting those forums, keeping them legitimate and working very hard to abide by the law.

Jambos Kickback is an important site for my football team, and many people use it to find out what is going on. It is run by volunteers with no money at all—just for the sheer love of being on the forum together—so I fully understand what the petitioner wants to bring forward. I thank my hon. Friend for the measured way in which he put forward the e-petition. He called for robust, effective and proportionate regulation, which is what the Government are trying to do through the Online Safety Act.

The shadow Minister highlighted that by going through the ledger of the positive and negative issues that the Government face, and indeed that were faced when her party was in government. The one thing on that ledger that is non-negotiable is the safety of children online—I think all hon. Members made that point; in fact, I am disappointed that those who do not make that point are not in this debate to try to win that argument, because I would be very interested to hear what they have to say.

The petition received over 550,000 signatures. Although I appreciate the concerns that it raised, I must reiterate the Government’s very strong response that we have no plans to repeal the Online Safety Act. Parents should know and be confident that their children—I am a father of two young girls, aged five years and ten months—are safe when they access popular online services and that they can benefit from the opportunities that the online world offers. That is why the Government are working closely with Ofcom to implement the Act as quickly and as effectively as possible to enable UK users to benefit from the Act’s protections.

This year, 2025, has been one of significant action on online safety. On 17 March the illegal harms codes of practice came into effect. Those codes will drive significant improvements in online safety in several areas. Services are now required to put in place measures to reduce the risk of their services facilitating illegal content and activity, including terrorism, child sexual abuse and exploitation, and other kinds of illegal activity.

I asked the officials for a list of the priority offences in the Act; there were 17, but that number has increased to 20, with the new Secretary of State at the Department adding some others. It is worth reading through them, because it shows the problem and its scale. I was really struck by Members who talked about the real world and the online world: if any of these offences were happening in the real world, someone would be carted off to jail immediately rather than being allowed to continue to operate, as they do online.

The priority offences are assisted suicide; threats to kill; public order offences such as harassment, stalking and fear of provocation of violence; drugs and psychoactive substances; firearms and other weapons; assisted illegal immigration; human trafficking; sexual exploitation; sexual images; intimate images of children; proceeds of crime; fraud; financial services fraud; foreign interference; animal welfare; terrorism; and controlling or coercive behaviour. The new ones that have been added by the Secretary of State include self-harm, cyber-flashing and strangulation porn. Do we honestly have to write that into a schedule of an Online Safety Act to say that those things are unacceptable and should not be happening on our computers?

On 25 July, the child safety regime came into force. Services now use highly effective age assurance to prevent children in the UK from encountering pornography and content that encourages, promotes and provides instructions for self-harm, suicide or eating disorders. Platforms are also now legally required to put in place measures to protect children from other types of harmful content, including abusive or hateful content, or bullying and violent content.

When we visited schools, we spoke to headteachers, teachers and parents about the real problem that schools have in trying to deal with the bullying effects of social media. According to Ofcom’s 4 December report that some hon. Members have referenced already, many services now deploy age checks, including the top 10 most popular pornographic sites, the UK’s most popular dating apps and a wide range of other services, including X, Telegram, Reddit, TikTok, Bluesky, Discord, Xbox and Steam. This represents a safer online experience for millions of children across the UK; we have heard that it is already having an impact.

The Government recognise, however, the importance of implementing the duties proportionately. That is why proportionality is a core principle of the Act and is built into many of the duties contained within it. Ofcom's illegal content and child safety codes of practice set out recommended measures that are tailored to both size and risk to help providers to comply with their obligations—it is really important to emphasise that. When recommending steps that providers can take to comply with their duties, Ofcom must consider the size and risk level of different types and kinds of services.

Let me just concentrate on that for a minute. For instance, Ofcom recommends user blocking and muting measures to help to protect children from harmful content, including bullying, violent content and other harmful materials, and those recommendations are tailored to services’ size and risk profile. Specifically, Ofcom recommends that all services that are high risk for this content need to implement those measures in full. However, for services that are medium risk for this content, Ofcom suggests that they need to implement the measures only if they have more than 700,000 users.

However, while many services carry low risks of harm, risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government are very concerned about small platforms that host the most harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting all small services from duties requiring them to tackle that type of content would mean that those forums would not be subject to the Act’s enforcement powers, which is why we reject the petitioner’s views. Even forums that might seem harmless carry potential risks, such as where adults can engage directly with child users.

The Government recognise the importance of ensuring that low-risk services do not have unnecessary regulatory burdens placed upon them, which I hope reassures the shadow Minister. That is why, in the statement of strategic priorities issued on 2 July, the Government set out our expectation that Ofcom should continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. The Government also made it explicitly clear that Ofcom should ensure that expectations on low-risk services are proportionate.

Alongside proportionate implementation of the Act, the Government also understand the need to communicate the new regulations effectively, and to work with companies within the Act's scope to ensure that compliance is as easy as possible. To deliver that, Ofcom is providing support to online service providers of all sizes to make it easier for them to understand and comply with their responsibilities under the UK's new online safety laws. For example, Ofcom has already launched a regulation checker to help firms to check whether they are covered by the new rules, as well as a number of quick guides for them.

I will address some of the issues raised by Members. My right hon. Friend the Member for Oxford East (Anneliese Dodds) started by raising the issue of pornography and other harmful content. User-to-user services that allow pornographic content, and content that promotes, provides instructions for or encourages suicide, self-harm or eating disorders, must use highly effective age assurance to prevent all children under 18 from accessing that type of content.

Services must take proportionate steps to minimise the risk of children encountering that type of content when using them, and they must also put in place age assurance measures to protect children from harmful content, such as bullying and violent content. Ofcom’s “Protection of Children Codes of Practice” set out what steps services can take to comply, and Ofcom has robust enforcement powers available to use against companies that fail to fulfil those important duties. We are already seeing that enforcement happening, with 6,000 sites having taken action to stop children from seeing harmful content, primarily via age checks. That shows the scale of the issue.

Virtual private networks have also been mentioned by a number of Members, including the shadow Minister. Following the introduction of the child safety duties in July, Ofcom reported that UK daily active users of VPN apps temporarily doubled to around 1.5 million—the average is normally about 750,000. Since then, usage has dropped, falling back down to around 1 million daily users by the end of September. That was expected, and it has also happened in other jurisdictions that have introduced age checks. According to an Ofcom rule, services should

“take appropriate steps to mitigate against methods of circumvention that are easily accessible to children”.

If a provider is not complying with the age assurance duties, for example by promoting VPN usage to bypass age assurance methods, Ofcom can and should take enforcement action. The use of VPNs does not excuse platforms from complying with the Act itself.

Jim McMahon

The Minister has done a huge amount of work on this issue, which I am sure is appreciated by everyone in this House. It cannot be beyond the wit of man to find a way for these VPN companies to bridge between the service user and the ultimate website or platform that they are viewing, so why are VPNs not in scope of the legislation to ensure that they are compliant with the age verification measures? Presumably, it is more difficult for the end website to know the origins of the user if they have bypassed age checks via a VPN. Surely the onus should also be on the VPN company to comply with the law.

Ian Murray

My hon. Friend makes a good point; let me come back to him in detail on the VPN issue, as his question relates to what we are planning to do in our review of the Online Safety Act, including both what was written into the legislation and what was not.

My hon. Friend the Member for Darlington (Lola McEvoy), who is no longer in her place, highlighted the really important issue of chatbots, which has also been mentioned by a number of other Members. Generative AI services, including chatbots that allow users to share content with one another or that search live websites to provide search results, are already regulated under the Online Safety Act. Those services must protect users from illegal content and children from harmful and age-inappropriate content.

Victoria Collins

Ofcom has said, and my understanding is, that in certain circumstances AI chatbots are covered, but certain new harms—such as emotional dependence—are not. That is an area where the House and many people are asking for clarity.

Ian Murray

I do not disagree with the hon. Lady. There are a whole host of issues around porn bots and AI-generated bots that have now also sprung up. We are committed to the Online Safety Act and to reviewing it as it is implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.

We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children, and to look at online harms and how they are evolving.

Emily Darlington

I wanted to raise with the Minister that the Science, Innovation and Technology Committee will be undertaking an inquiry in the new year on brain development, addictive use and how those affect key stages of children's development. The Minister says that he will look at all evidence. Will he look at the evidence produced by that inquiry to ensure that its information and advice reaches parents across this country?

--- Later in debate ---
Ian Murray

I thank my hon. Friend for the work that she does on that Committee. Of course, the Government have to respond in detail to such reports, and we look forward to the recommendations that it makes. Often we see conspiracy theories in the online world, but there is no conspiracy theory here: the Government are not trying to defend a position against what evidence might come forward.

We have just signed a memorandum of understanding with Australia to look at their experiences of protecting children online and whether there are things that we can do in this country. It has to be evidence-based, and if the evidence base is there, we will certainly make sure to act, because it is non-negotiable that we protect young people and children online.

Jim McMahon

I think there is no disagreement on the protection of children, and no disagreement on what we have legislated to be illegal content. More debate is needed on content that is harmful but not illegal, on where that line sits and what we enforce, and on the protections for those who are not children, particularly vulnerable users and those who are being exploited and drawn into some quite extreme behaviours.

I will be honest about where some of these tensions are. How confident will the UK Government be in entering into negotiations on this when we are in the position we are in on trade with the US? The US has also made it clear that it sees any further regulation of social media platforms as an infringement on trade and freedom of speech. When it comes to making that call, where will the UK Government be?

Ian Murray

My hon. Friend makes an important point, because freedom of expression is guaranteed in the Act. Although we are regulating to make sure that children and young people are protected online, he is right to suggest that that does not mean we are censoring legal content for adults. The internet is a place where people can access content if they are age-verified to do so, but it cannot be illegal content. The list of issues in schedule 7 to the Act that I read out at the start of my speech is pretty clear on what someone is not allowed to do online, so any illegal content online still remains illegal. We need to work closely with the online platforms to make sure that such content is not being purveyed through them.

We have seen strong examples of this issue in recent months. If we reflect back to Southport, the public turned to local newspapers—we have discussed this many times before—because they wanted fast and regular but trustworthy news. They turned away from social media channels to get the proper story, and they knew they could trust the local newspaper that they were able to pick up and read. I think the public have a very strong understanding of where we are, but I take the point about people who are not as tech-savvy or are impaired in some way, and so may need further protections. My hon. Friend makes the argument very strongly.

I want to turn to AI chatbots, because they were mentioned in terms of mental health. We are clear that AI must not replace trained professionals. The Government’s 10-year health plan lays foundations for a digital front door for mental health care. Last month, the Secretary of State for Science, Innovation and Technology urged Ofcom to use existing powers to protect children from the potential harms of AI chatbots. She is clear that she is considering what more needs to be done. The Department of Health and Social Care is looking at mental health through the 10-year plan, but the Secretary of State for Science, Innovation and Technology has also been clear that she will not allow AI chatbots to affect young people’s mental health, and will address their development, as mentioned by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins).

Let me touch on freedom of expression, because it is important to balance that out. It is on the other side of the shadow Minister’s ledger, and rightly so, because safeguards to protect freedom of expression and privacy are built in throughout the Online Safety Act. Services must consider how to protect users’ rights when applying safety measures, including users’ rights to express themselves freely. Providers do not need to take action on content that is beneficial to children—only against content that poses a risk of harm to children on their services. The Act does not prevent adults from seeking out legal content, and does not restrict people posting legal content that others of opposing views may find offensive. There is no removing of freedom of speech. It is a cornerstone of this Government, and under the Act, platforms have duties to protect freedom of speech. It is written into legislation.

Let me reiterate: the Online Safety Act does not limit freedom of speech. In fact, it protects it. My hon. Friend the Member for Worcester (Tom Collins) was clear when he said in his wonderful speech that making the internet a safe space promotes freedom of speech. Indeed it does, because it allows us to have the confidence that we can use online social media platforms, trust what we are reading and seeing, and know that our children are exposed to age-appropriate content.

I will address age assurance, which was mentioned by the hon. Member for Dewsbury and Batley (Iqbal Mohamed). Ofcom is required to produce a report on the use of age assurance technologies, including the effectiveness of age assurance, due in July 2026—so in seven months' time. That allows sufficient time for these measures to bed in before further action is considered, but the Government continue to monitor the impact of circumvention techniques such as VPNs and the effectiveness of the Act in protecting children. We will not hesitate to go further if necessary, but we are due that report in July 2026, which will be 12 months from the implementation of the measures.

The Liberal Democrat spokesperson asked about reviewing the Act. My previous comments covered some of that, but it is critical that we understand how effective the online safety regime is, and monitoring and evaluating that is key. My Department, Ofcom and the Home Office have developed a framework to monitor the implementation of the Act and evaluate the core outcomes from it.

Tom Collins

The Minister describes the review of the Act and how we have a rapidly growing list of potential harms. It strikes me that we are up against a very agile and rapidly developing world. I recently visited the BBC Blue Room and saw the leading edge of consumer-available technology, and it was quite disturbing to see the capabilities that are coming online soon. In the review of the Act, is there scope to move from a register of harms to domains of safety, such as trauma, addiction or attachment, where the obligation would be on service providers or manufacturers to ensure their products were safe across those domains? Once again, there could be security for smaller businesses from the world of technical standards: if a business offers a simple service and meets an industry-developed standard, it has a presumption of compliance. The British Standards Institution has demonstrated very rapid development of that through the publicly available specification system, and that is available to help us to navigate this rapidly. Could that be in scope?

Sir John Hayes (in the Chair)

Interventions should be brief, but I am very kind.

Ian Murray

Sir John, you are indeed very kind. My hon. Friend gave two examples during his speech. First, he mentioned brakes that were available only for high-end and expensive cars, and are now on all cars. Secondly, he mentioned building regulations, and how we would not build a balcony without a barrier. Those examples seem fairly obvious and almost flippant, but it seems strange that we would regulate heavily to make sure that people are safe physically—nobody would ever argue that it would be a complete disregard of people’s freedom to have a barrier on an 18th-floor balcony—but not online. We do that to keep people safe, and particularly to keep children safe. As my hon. Friend said, if we are keeping adults safe, we are ultimately keeping children safe too.

We have to continue to monitor and evaluate. I was just about to come on to the post-implementation review of the Act, which I am sure my hon. Friend will be very keen to have an input into. The Secretary of State must complete a review of the online safety regime two to five years after part 3 of the Act, which is about duties of care, fully comes into force. The review will therefore be completed no sooner than 2029. These are long timescales, of course, and technology is moving, so I understand the point that he is making. I recall that in the Parliament from 2010 to 2015, we regulated for the telephone, so we move slowly, although we understand that we also have to be nimble in how we legislate.

The Lib Dem spokesperson, the hon. Member for Harpenden and Berkhamsted, asked whether the Act has gone far enough. Ofcom, the regulator, is taking an iterative approach and will strengthen codes of practice as online harms, technology and the evidence evolve. We are already making improvements, for example strengthening the law to tackle self-harm, cyber-flashing and strangulation. The hon. Lady also asked whether Ofcom has received an increase in resources. It has—Ofcom spending has increased by nearly 30% in the past year, in recognition of its increased responsibilities. She also asked about a digital age of consent. As I mentioned, we have signed a memorandum of understanding with Australia and will engage with Australia to understand its approach. Any action will be based, of course, on robust evidence.

Victoria Collins

I would just like to clarify that I made a call for an age of data consent. We put that forward earlier this year as an amendment to the Act. A very first step is to stop social media companies harvesting data and using it to power these addictive algorithms against young people. It is about raising the age of data consent to 16. Then of course, there is the wider discussion about what is happening with social media in general, but that age of data consent is our first call to action.

--- Later in debate ---
Ian Murray

I take that point about the amendment that the Liberal Democrats tabled.

The hon. Lady also asked for a cross-party Committee to take action. I have already talked about the review of the implementation of the regulations that will happen in July and the other stages after that, as well as the post-implementation review. Of course, setting up a new Committee is a matter for the House. I have no objections to the House setting up Committees to look at these big and important issues that we all care about, if that is what it decides to do.

My hon. Friend the Member for Worcester talked about the issue of Parliament and engagement. He asked whether the Department would engage with the group of academics he mentioned, who are looking at technical safety standards for social media, and what role those academics could play in relation to these provisions. I welcome his invitation and I am sure that the Minister responsible for this area—the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Vale of Glamorgan (Kanishka Narayan)—would be delighted to participate in those talks. I am sure that he will be in touch with my hon. Friend the Member for Worcester to take him up on that offer.

We have heard about algorithms, so it is worth concentrating on them. Hon. Friends have talked about the algorithms that serve harmful content. The Government have been clear that algorithms can impact on the risk of harm for children, which is why the legislation comprehensively covers them. The legislation requires providers to consider, via risk assessment, how algorithms could impact children's exposure to illegal or harmful content, and providers must then take steps to mitigate those risks. If they do not do so, Ofcom has powers that it can use.

Jim McMahon

There needs to be a tie-in here with the Cabinet Office and the review of electoral law. If a kind donor in my constituency owned a big billboard and gave me absolute free use of it during an election period, but made an offer to any other party that they could put a Post-it note on the back of it that nobody would see, I would have been expected to declare that as a gift in kind, or a donation in kind. That is not the case with algorithms that are posting and promoting generally right-wing and far-right content during the regulated period. Surely there has to be a better join-up here of election law and online law.

Ian Murray

This is a huge issue, and all of us in this House are very concerned about misinformation and disinformation, and their impact on our democracy. Indeed, I am sure that in the time that I have been speaking here in Westminster Hall, my own social media will have been full of bots and all sorts of other things that try to encourage people to get involved in this debate, in order to influence the algorithm. That can fundamentally disturb our democracy, and it is something we are looking at very closely. The Cabinet Office and my Department are looking at the misinformation and disinformation issue, as is the Department for Culture, Media and Sport in terms of the media landscape and how elections are run in this country. We should all be very clear that our democratic processes must not be undermined by algorithmic platforms that serve up misinformation and disinformation to the public.

Emily Darlington

I appreciate what the Minister says—that these powers are in legislation—yet the process still amounts to the social media platforms marking their own homework. We are in a vicious circle: Ofcom will not take action unless it has a complaint based on evidence, but the evidence is not obtainable because the algorithm is not made available for scrutiny. How should Ofcom use those powers more clearly ahead of the elections to ensure that such abuse of our democracy does not occur?

Ian Murray

A whole host of legislation sits behind this, including through the Electoral Commission and the Online Safety Act, but it is important for us to find ways to ensure that we protect our democratic processes, whether that be from the algorithmic serving of content or from foreign state actors. It is in the public domain that, when the Iranian servers went dark during the conflict with the US, a third of pro-independence Facebook pages in Scotland went dark, because they were being served by foreign state actors. We have seen that from Russia and various other foreign actors. We have to be clear that the regulations in place need to be implemented and, if they are not, we need to find other ways to ensure that we protect our democracy. On a slight tangent, our public sector broadcasters and media companies are a key part of that.

To stay with my hon. Friend the Member for Milton Keynes Central (Emily Darlington), she made an excellent contribution, with figures for what is happening. She asked about end-to-end encryption. We support the responsible use of encryption, which is a vital part of our digital world, and the Online Safety Act does not ban any service design such as end-to-end encryption, nor does it require the creation of back doors. However, implementing end-to-end encryption in a way that intentionally blinds tech companies to content would have a disastrous impact on public safety, in particular for children, and we expect services to think carefully about their design choices and to make their services safe by design for children.

That leads me to online gaming platforms and Roblox, which my hon. Friend also mentioned. Ofcom has asked the main platforms, including Roblox, to share what they are doing and to make improvements where needed. Ofcom will take action if those improvements are not made. A whole host of things are happening, and we need to give the Online Safety Act and the regulations underpinning it time to feed through. I hope that we will start to see significant improvements, as reflected on by my hon. Friend the Member for Sunderland Central.

My hon. Friend the Member for Milton Keynes Central mentioned deepfakes. That issue is important to our democracy as well. The Government are concerned about the proliferation of AI-enabled products and services that enable deepfake non-consensual images. In addition to criminalising the creation of non-consensual images, the Government are looking at further options, and we hope to provide an update on that shortly. It is key to protecting not only our wider public online but, fundamentally, those who seek public office.

The Government agree that a safer digital future needs to include small, personally owned and maintained websites. We recognise the role that proportionate implementation of the Online Safety Act plays in supporting that aim. We can all agree that we need to protect children online, and we would not want low-risk services to bear any unnecessary compliance burden. That is the balance we have to strike to make the regime proportionate. The Government will conduct a post-implementation review of the Act and will consider the burdens on low-risk services as part of that review, as mentioned in the petition. We will also ensure that the Online Safety Act protects children and is nimble enough to deal with a very fast-moving tech world. I thank all hon. Members for a constructive debate and for the issues they have raised. I look forward to engaging further in the months and years ahead.