
Online Harm: Child Protection

Debate between Emily Darlington and Matt Rodda
Tuesday 24th February 2026


Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)

This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day, which is just appalling. We know the suffering and pain caused by seeing images tagged with the terms “ana”, “thinspiration” and other terms that should go. The promotion of such content is now a category 1 offence, and Ofcom should be weeding it out. The hon. Member for Winchester (Dr Chambers) is absolutely right to say that that measure should be extended to bots.

I thank the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), for her fantastic speech. We have taken this matter seriously since the very beginning of the parliamentary Session, and we have done a lot of work on it. I echo her call for Ministers to look again at the recommendations in our Committee’s “Social media, misinformation and harmful algorithms” report, which goes well beyond misinformation and into how the damage is done.

Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom. In the spirit of consultation—I know that we will get to that—I have done my own consultation with 500-plus 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, what will surprise the House is what young people consider social media profiles to be. We consider them to be Facebook or Instagram, while they consider them to be YouTube and Roblox—two organisations not covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. Let me remind the House that, at that age, the brain development of young women is close to finished, while for young men, whose brain development does not finish until they are about 25, it is nowhere near complete. We know that from the science—just to be clear, that is not an opinion. Brain development in young women and girls happens differently, so should we therefore have different rules for young women and men?

Fifty-nine per cent of the 14 to 16-year-olds have been contacted by strangers, and more than a third of those contacts were through Roblox, which is not covered by the Australian social media ban. Thirty-three per cent have been bullied, and a third of those incidents were on Roblox. The Australian social media ban—which I assume is what the Liberal Democrats are talking about when they say they are in favour of a ban—does not cover YouTube or Roblox, and we have not even looked at whether it is effective. A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, “We cannot go far enough, so we are going to roll back. It is about free speech.” No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. The Online Safety Act was a missed opportunity. It also took seven years to get through this House, but we do not have seven years to wait.

There would also be unintended consequences to a ban. I had the pleasure of meeting Ian Russell the other night, and we had a really powerful discussion. My heart goes out to him, as one parent to another, given what his family have been through. He does not jump to the easy solution of a social media ban. The Molly Rose Foundation has done a brilliant briefing paper, which every MP should read, about why it does not support a ban: it wants the online world to be safe for children, but a ban does not make it so.

Matt Rodda

My hon. Friend is making an excellent speech. I commend her work in reaching out to young people; it sounds superb. The lesson may be that we should all do exactly that. I am running a survey myself. She mentioned the Molly Rose Foundation, and I have met some of its staff to discuss its work. A family in my constituency of Reading suffered a terrible incident—their son was murdered in an incident of online bullying—and they have a different view. Does my hon. Friend agree that it is important that we properly listen to the families and consider the different views in the consultation?

Emily Darlington

I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.

Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report when they see harmful content or are being targeted on social media, because they worry that they will get in trouble for breaking the law.

A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and all of a sudden—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.