Online Safety Act 2023: Repeal

Debate between Jim McMahon and Victoria Collins
Monday 15th December 2025


Westminster Hall


This information is provided by Parallel Parliament and does not comprise part of the official record

Victoria Collins (Harpenden and Berkhamsted) (LD)

It is a pleasure to serve under your chairmanship, Sir John. I congratulate the hon. Member for Sunderland Central (Lewis Atkinson), who made a very eloquent opening speech, and Members from across the Chamber, who have touched on really important matters.

As the hon. Member mentioned, the online space gives us great opportunities for connection and knowledge gathering, but also opportunities for greater harms. What has come across today is that we have addictive algorithms pushed in furtherance of commercial and malevolent interests (security interests, for example, although not the security interests of Great Britain), with no regard for the harm or impact they have on individuals or society.

When it comes to the Online Safety Act, we must get the balance right. Its protections for children and the vulnerable are vital. Of course, it is important to maintain freedom of speech and access to information. The Act is a step in the right direction in protecting children from extreme content, and we have seen changes in pornographic content. However, there are areas where it has not gone far enough, and it is not ready for the changes that are coming at a fast pace. There are websites that serve a public good that are age-gated, and forums for hobbies and communities that are being blocked. As the Liberal Democrats have said, we have to get the balance right. We also have to look at introducing something like a digital Bill of Rights with agile standards in the face of fast-paced changes, to embed safety by design at the base.

The harms that we need to protect children and vulnerable people from online are real. The contributions to this debate from hon. Members from across the House have been, as always, eye-opening and a reminder of how important this issue is. On pornographic content, we heard from the hon. Members for Morecambe and Lunesdale (Lizzi Collinge) and for Milton Keynes Central (Emily Darlington) sickening reminders of the horrific content online that young people see—and not by choice. We must never forget that, as has also been said, people are often not seeking this content, but it comes through, whether on X, which was Twitter, or other platforms. The Molly Rose Foundation highlighted that

“children using TikTok and X were more than twice as likely to have encountered…high risk content compared to users of other platforms.”

The online world coming to life has been mentioned in this debate. One of my constituents in Harpenden wrote to me, horrified that her daughter had been strangled on a dancefloor, because it showed how violent, graphic content is becoming normalised. That struck me to my core. Other content has also been mentioned: suicidal content, violent content and eating disorder misinformation, which the hon. Member for Worcester (Tom Collins) talked so eloquently about. The Molly Rose Foundation also highlighted that one in 10 harmful videos on TikTok have been viewed more than 1 million times, so we have young people seeing that extreme content.

Even beyond extreme content, we are starting to see the addictive nature of social media, and the insidious way that this short-form content is becoming such a normalised part of many of our lives. Recent polling by the Liberal Democrats revealed that 80% of parents reported negative behaviours in their child due to excess phone usage, including skipping meals, having difficulty sleeping, or reporting physical discomforts such as eye strain or headaches. Parents and teachers know the real harms that are coming through, but young people themselves do too. I carried out a safer screens tour in my constituency in which I spoke to young people. Many of them said that they are seeing extreme content that they do not want to see, and that, although they have blocked the content, it comes back. The Online Safety Act is helping to change that, but it has not gone far enough. The addictive element of social media is important. In our surveys, two quotes from young people stood out. One sixth-former said that social media is

“as addictive as a drug”,

and that they felt its negative effects every day. Another young person simply wrote, “Help, I can’t stop.” Young people are asking for help and protection; we need to hold social media giants and online spaces to account.

It is welcome that some of those harms have been tackled by the Online Safety Act. On pornography, Pornhub has seen a 77% reduction in visitors to its website; Ofcom has launched 76 investigations into pornography providers and issued one fine of £50,000 for failing to introduce age checks, but we need to ask whether that goes far enough. It has come across loud and clear in this debate that the Online Safety Act has not gone far enough. Analysis has shown that Instagram and TikTok have started to introduce new design features that comply with the Online Safety Act, but game the system to still put forward content that is in those companies’ commercial interests, and not in the interests of young people.

Other extremely important harms include the new harms from AI. Many more people are turning to AI for mental health support. Generative AI is creating graphic content, and the Internet Watch Foundation found that

“reports of AI-generated child sexual abuse material have more than doubled in the past year”

and the IWF says it is at the point where it cannot tell the difference any more—it is horrific.

Jim McMahon

The hon. Lady is making a very important point. It really concerns me to see just how desensitised young people or adults can become when they see that type of content, and that inhumane content is directly linked to misogyny and racism. While I know no Member of this House would say such a thing, outside this place I could imagine an argument being made that harm depicted in AI-generated content is not real harm, because the content in itself is not real and no real abuse has been carried out. However, does the hon. Lady share my concern that such content is incredibly harmful, and that there is a real danger that it entraps even more people down the very dark route to what is essentially child abuse and to further types of harm, which will then present in the real world in a way that I do not think even Parliament has yet registered? In a sense, this problem is becoming more and more of a public health crisis.

Victoria Collins

Absolutely. The insidious part of this issue is the normalisation of such harmful content. In a debate on Lords amendments to the then Data (Use and Access) Bill, on creatives and AI, I mentioned the fact that, in the two weeks since the previous vote, we had seen the release of Google Veo 3—the all-singing, all-dancing video creation software. We are moving so quickly that we do not see how good AI-generated content is becoming. Some content that we see online is probably AI-generated, but we do not realise it. On top of that, as the hon. Gentleman said, AI normalises extreme content and produces content that people think is real, but is not. That is very dangerous for society.

My next point concerns deepfakes, which are undermining trust. Some deepfakes are obvious; some Members of Parliament and news presenters have been targeted through deepfakes. Just as important, however, is the fact that much deepfake content seems normal, but is undermining trust in what we see—we do not know what is real and what is not any more. That is going to be very dangerous not only in terms of extreme content, but for our democracy, and that argument has been made by other Members in this debate.

It is also worrying that social media platforms do not seem to see that problem. To produce its risk assessment report, Ofcom analysed 104 platforms and asked them to put in submissions: not a single social media platform classified itself as high risk for suicide, eating disorder or depression content. Yet much of what we have heard during this debate, including statistics and anecdotal stories, shows that that is just not true.

On the other hand, while there are areas where the Online Safety Act has not gone far enough, in other areas it has overstepped the mark. When the children’s code came into place, Lord Clement-Jones and I wrote to the Secretary of State to outline some of our concerns, including political content being age-gated, educational sites such as Wikipedia being designated as category 1, and important forums about LGBTQ+ rights, sexual health or potentially sensitive topics being age-gated, despite being important for many who are learning about the world.

Jamie from Harpenden, a young person who relies heavily on the internet for education, found when he was looking for resources that a lot of them were flagged as threatening to children and blocked, and he felt that that hindered his education. Age assurance systems also pose a problem for data protection and privacy. The intention behind this legislation was never to limit access to political or educational content, and it is important that we support access to the content that many rely on. But we must protect our children and vulnerable people online, and we must get that balance right.

I have a few questions for the Minister. Does he agree with the Liberal Democrats that we should have a cross-party Committee of both Houses of Parliament to review the Online Safety Act? Will he confirm what resources Ofcom has been given? Has analysis been conducted to ensure that Ofcom has enough resources to tackle these issues? What are the Government doing about AI labelling and watermarking? What are they doing to tackle deepfakes? Does the Minister agree that it is time to support the wellbeing of our children, rather than the pockets of big tech? Will the Minister support Liberal Democrat calls to increase the age of data consent and ban social media giants from collecting children’s data to power the addictive algorithms against them? We are calling for public health warnings on addictive social media for under-18s and for a doomscroll cap. Most important is a digital bill of rights, with standards that, in light of the fast pace of change, need to be agile.

Our young people deserve better. We need to put children, young people and vulnerable people before the profits of big tech. We will not stop fighting until that change is made.