
Thu 19th Mar 2026

Online Harms

Debate between Wera Hobhouse and Lauren Sullivan


Commons Chamber
Dr Lauren Sullivan (Gravesham) (Lab)

I thank the hon. Member for St Neots and Mid Bedfordshire for securing this important debate.

Online harms are systemic, they are scaled, and they are producing real-world consequences, as we have seen. Social media is now the environment in which young people grow up—it is almost universal by the time children enter secondary school. According to a consultation by the Department for Science, Innovation and Technology, 81% of 10 to 12-year-olds are on social media, and 86% have accounts. The Youth Select Committee also did a study on youth violence and social media back in 2024, and found that 97% of 13 to 17-year-olds were online and that 70% of them see real-world violence online. That is a lot of statistics, but they all point to the same fact: social media is now in every young person’s bedroom, in their hand and in their pocket.

Professor Sarah-Jayne Blakemore from the University of Cambridge told me that adolescent brains are highly sensitive to the social environment, and the social media companies are probably aware of this. Adolescents’ brains have heightened neuroplasticity, and this will continue until their mid or late 20s. During adolescence, young people are trying to find identity and belonging, and I fear that the tech companies are exploiting this.

Where can we see evidence of harm? The National Education Union did a study called “Big Tech’s Little Victims”, in which researchers created fictional accounts and spent half an hour each day on Instagram, TikTok, Snapchat and YouTube. They found that harmful content appeared within three minutes, and often immediately. Young people in my constituency say, “I do not want to see this harmful content any more,” yet they are still shown it, so what is going on?

The hon. Member for St Neots and Mid Bedfordshire mentioned the “Inside the Rage Machine” documentary, which I have seen a number of times. I am absolutely horrified at what the whistleblowers have revealed.

Wera Hobhouse

The hon. Lady is making a very powerful speech about how young people, whose brains are still being formed, are being bombarded with online content. May I just let her know that my hon. Friend is actually the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom)? When she mentions him again, she might correct that.

Dr Sullivan

My apologies to the hon. Member for St Neots and Mid Cambridgeshire (Ian Sollom).

I was speaking about “Inside the Rage Machine”. What the whistleblowers have revealed is remarkable. The documentary makers found that serious exploitation cases were not being prioritised by TikTok, and that algorithms were repeatedly pushing harmful content.

It is not as simple as saying that we must ban children from social media; we need a suite of measures. The core issue is that young people, who are forming their identities, are vulnerable. Addictive algorithms are designed to maximise time and engagement, and they prioritise provocation instead of the truth. Louis Theroux’s Netflix documentary on the manosphere is an incredibly powerful and timely contribution to the debate, and he shows us that the online world is like a gold rush in the wild west. The approach of “hook, identity, monetise” drives profits, with streaming platforms like YouTube rewarding people who spout abominable things. There is a business model behind this, and I think we are all very much aware that we need to do something about it.

Harmful content spreads across platforms, so we need to be very clear about any ban on social media. Last week, the Science, Innovation and Technology Committee looked at the ban in Australia. We learned that because Australia defined which social media companies were to be included, other companies took their place. We can learn from that and it can feed into the Government’s consultation. We have to make the legislation stronger. Bans have limits, because they can be bypassed, as we see in Australia. They also shift the responsibility to the user. Why can we not shift the responsibility to the companies? We should not be banning children from social media; we should be banning the companies from exploiting our children.