
Online Harm: Child Protection

Debate between Victoria Collins and Kirsty Blackman
Tuesday 24th February 2026


Commons Chamber
Victoria Collins (Harpenden and Berkhamsted) (LD)

I have been quite shocked at some of the procedural discussion for several reasons. First, we are acting like this has just come up, but even in the House of Commons under this mandate, as my hon. Friend the Member for South Devon (Caroline Voaden) mentioned, the safer phones Bill was put forward in 2024. As Liberal Democrats, we put forward amendments to change the age of data consent and to ban addictive algorithms. There have also been calls to act on doomscroll caps, and we have highlighted the harms of AI chatbots. Yet we are at a point—I absolutely respect what the hon. Member for Aberdeen North (Kirsty Blackman) was saying on this—where a consultation was proposed by the Government over a month ago, but we still do not know the details. There are things going through the House of Lords that, again, we do not know the details of. At the very least, Liberal Democrats are trying to give the space for that and say, “Yes, we need to start putting forward that legislation.” If there is another chance to debate that, what is the harm in this motion, given that this is such a crucial issue?

Secondly, it is not as if this is an issue that turned up yesterday. As the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah) talked about, these harms have been happening for years—over 22 years for Facebook. I will go on to say more about that in a moment. Other countries around the world are showing leadership on this and saying that we have to act now. My point is that at the very least, a consultation could have been launched earlier. This is not something new in this Parliament. We are saying that action needs to be taken.

Most importantly, the parents, children and experts watching this debate want to see us taking this issue seriously. Children and young people are at the heart of this. I think back to the first time I met some of the sixth-form students at Ashlyns school in Berkhamsted. I will never forget sitting around that table with one sixth-former—let’s call him James. He told me about his fears for the mental health of his friends. He warned about the self-harm that he was seeing among his peers, which his teachers were not even aware of, and he talked about the role of social media. A few weeks later, I was pulled to one side at St George’s school in Harpenden, where some young women shared with me their concerns about the growing misogyny lived out by young men, which started on social media.

Since then, I have carried out a “Safer Screens” tour meeting young people. Students have talked about brain rot and seeing the extreme content that the algorithm continues to push on them, even when they try to block it—the hon. Member for North West Cambridgeshire (Sam Carling) talked about that. One student said, “It is as addictive as a drug”, and they see the harms of it every day.

This is the tipping point, and I am surprised that many Members think that it is not. This is that moment. Parents, teachers, experts and even young people are crying out for action, and have been for a long time, to tackle the social media giants that have no care for their mental health. As I said, this tipping point has been years in the making. Facebook was launched 22 years ago. Indeed, a Netflix documentary from six years ago started to highlight the warnings from people who worked in tech about social media. One expert said that it is

“using your own psychology against you.”

Having worked in tech myself, I have read the books and received the training on how these social media giants get us hooked—it is built in.

Awareness is growing. I thank Smartphone Free Childhood, Health Professionals for Safer Screens, the Molly Rose Foundation, the Internet Watch Foundation and the Online Safety Act Network, along with projects such as Digital Nutrition—the hon. Member for Milton Keynes Central (Emily Darlington) and others have made the analogy of an online diet—that have worked to ask what the guidance should be. Those are just a few of the organisations I could name that have worked tirelessly to ensure these voices are heard.

I also thank pupils in my constituency from Roundwood Park, St George’s, Sir John Lawes, Berkhamsted and Ashlyns schools, and students who have openly shared their experiences, hopes and concerns about the online world. Their concerns are not just about content; they are also about addiction. Let me be clear: as my hon. Friend the Member for Mid Dunbartonshire (Susan Murray) mentioned, the core of this issue is that this is the attention economy, so our children are the product. Their attention, focus and time are being sold to line the pockets of tech billionaires. Governments around the world are finally taking action. This is a seatbelt moment where we need to say, “Enough is enough.”

The hon. Member for Stoke-on-Trent Central (Gareth Snell) talked about trying to get this right. I respect that, but I often think that if we were able to walk down the street and see a 3D version of what young people are seeing in their online world, action would have been taken much sooner. My hon. Friend the Member for Eastleigh (Liz Jarvis) talked about holding tech companies to account. We need to start unpacking what children are seeing and finally take action.

The Online Safety Act has done great work, but it does not go far enough. It sets out illegal harms and a code for inappropriate content for children and over-18s, but not a framework of legal harms or age-appropriate content. The social media age of 13 is based on data processing that is managed by the Information Commissioner’s Office and has nothing to do with what is age-appropriate in that context. Dr Kaitlyn Regehr, the author of “Smartphone Nation”, talks about how the Act is reactive, not proactive, and leaves it up to the user to report problems rather than putting the burden of safety on tech giants.

We must ensure that we build on the OSA and learn the lessons from Australia. The hon. Member for Milton Keynes Central talked about this. In Australia, a wide definition of social media has left it to a small group to decide what is appropriate. That has meant that YouTube has been banned for under-16s, but YouTube Kids has not, with no real framework for why apart from the fact that they deem YouTube Kids safer. WhatsApp has not been banned, which is possibly the right thing, but legislators are left to play whack-a-mole as new social media apps pop up. There is no framework for harm from AI.

Kirsty Blackman

Will the hon. Member give way?

Victoria Collins

Very briefly; I want to leave the Minister time.

Kirsty Blackman

Australia just bans children from holding accounts; it does not ban them from using any of the platforms. They can still use YouTube; they just cannot have an account.

Victoria Collins

Absolutely. YouTube is everywhere. It is embedded in almost every website that has videos.

The hon. Member for Aberdeen North (Kirsty Blackman) asked about AI chatbots. Under the proposals we put forward in the Lords, AI chatbots are treated as user-to-user services. We have highlighted for a long time that the potential harms from AI chatbots are not covered. Ofcom has clarified that AI chatbots count as user-to-user services, but the harms themselves, such as the AI psychosis that my hon. Friend the Member for Winchester (Dr Chambers) alluded to, are not covered. That is why the harms-based approach we are putting forward is so important.

As my hon. Friend the Member for Twickenham (Munira Wilson) said when she opened the debate, the Liberal Democrats have been leading the work on online safety in this Parliament. We were the first party to push a vote on banning addictive algorithms. We have called for health warnings and a doomscroll cap. Today, we are calling for a vote on the age for social media and online harms. We are calling for a ban on harmful social media based on a film-style age rating. That harms-based approach holds tech companies to account, sets a pioneering approach to online standards and prepares for the future of AI chatbots and games like Roblox, which has already arrived.

In the offline world, anyone buying a toy for young children at this point would expect age ratings so that they know it is appropriate and safe, and films have had age ratings for over 100 years, yet we have not had that in the online world. The harms-based approach is backed by 42 charities and experts who work to protect children, stop violence against women and girls and make the internet a safer place.

We are also calling for a reset, because enough is enough. That includes a minimum age of 16 for social media and real accountability for tech companies with film-style age ratings. We need to make sure that we get the best out of the internet for young people and protect them from harms.

For me, it comes back to James, his friends and the young women and children I have spoken to around my constituency. We do not have time to waste—that is why we are pushing for these Bills. We are calling for action, and I call on MPs across the House to put children before politics, exactly as we did in the Lords. The amendment in the Lords could mean a blanket ban. We were uncomfortable with that approach—we much prefer ours—but we knew that the future of children came first. We must help the next generation to get the best of the online world—including those young people who have spoken out and shared their concerns and horror stories—and protect them from the worst of it.