Online Harm: Child Protection

Kirsty Blackman Excerpts
Tuesday 24th February 2026

Commons Chamber
Munira Wilson

I have given way to the hon. Member a couple of times. I am just about to finish.

Kirsty Blackman (Aberdeen North) (SNP)

I am confused about what the Liberal Democrats’ proposals are. The proposals laid out by the hon. Lady are not those introduced in the House of Lords. In the House of Lords, only user-to-user services were talked about, not addictive online gaming, for example. Are we discussing a Bill containing the proposals laid out in the House of Lords, or is the hon. Member putting forward new, ethereal proposals? I do not understand what this Bill is going to be. I was expecting to actually see it so that we could discuss it today.

Munira Wilson

The harms-based framework that we proposed in the other place would apply to chatbots and gaming as well. The point is that, as I have already laid out, we would come together and come forward with proposals that we can all agree on.

--- Later in debate ---
Kanishka Narayan

I will simply repeat the point I have made, which is that we are going to act by the summer. We have already sought permissive powers to ensure that the Government are able to act on the outcome of the consultation through rapid legislation. I hope the combination of those two commitments gives the hon. Member some assurance.

The engagement and consultation will take place alongside work with counterparts. We will be monitoring developments in Australia on its social media ban for under-16s to share learnings and best practice. We are steadfast in our belief that the right way to deliver the next steps to protect our children online is to be led by the evidence through our short, sharp three-month consultation.

Kirsty Blackman

The Minister has just said that the Government have already sought permissive powers. I understand that they are going to move an amendment in lieu to the Children’s Wellbeing and Schools Bill, but I am not aware that that amendment has been published yet, much less agreement sought from the House. When will that be published, so that we can see what those permissive powers are supposed to be?

Kanishka Narayan

I thank the hon. Member for that point, and commit to her that we are going to try to do that as soon as possible. She will be aware that the legislative process is already very tight, so I will come back to her and the House with the wording of the motion as soon as possible.

Last week, as I have mentioned, the Secretary of State confirmed that we will take new legal powers to allow us to act quickly on the outcomes of the consultation, delivering on our promises to parents. We will make sure that the wording is presented to the House at the earliest opportunity. We also recognise the importance of parliamentary scrutiny and the expertise that parliamentarians in both Houses provide, and have already committed that when regulations are brought forward, they will be debated on the Floor of the House and there will be a vote in both Houses, ensuring proper scrutiny. We are clear that the question is not whether we will act, but what type of action we will take. We will ensure that we do so effectively, in lockstep with our children and in the interests of British families.

--- Later in debate ---
Julia Lopez (Hornchurch and Upminster) (Con)

Today we are debating something very important: the protection of children from online harms.

I commend the hon. Member for Twickenham (Munira Wilson) on what I thought was a very heartfelt speech, but I fear that her good intent has been rather thrown under the bus by her party leadership. Setting aside the importance of this subject, let us look at their method of bringing it forward—a point which has been raised rather expertly by Members from across the House. Today the Liberal Democrats are doing what they do best: slightly nutty stunts. With all the menace of Captain Mainwaring, they are attempting to seize control of the Order Paper and effectively declare themselves not only Government for the day but, with their loosely defined online services Bill, rulers of the internet. It is a gimmick. It is the parliamentary equivalent of boinging into the Chamber on a giant bungee.

Though the hon. Member for Twickenham put a little bit of flesh on the bones in her speech, the motion itself simply requests the power to barge through this House a blank-cheque Bill for which we have no details, and in so doing lets the Government Benches clean off the hook. It has all gone a bit Benny Hill. It is a great shame, because it is a distraction when the moment of truth on social media for children is coming to us imminently. We know from the panicked recess briefings that the Prime Minister has been caught on the hop on an issue that is of deep concern to families, children, teachers and communities across the country.

Before too long the Children’s Wellbeing and Schools Bill will return to this House and Members will have the chance to vote on a credible proposition: an amendment tabled by the noble Lord Nash that no child under the age of 16 should have access to harmful social media.

Kirsty Blackman

If this is the Conservatives’ stance, why, when consideration of the Online Safety Bill lasted so long—it was even recommitted to Committee, which had not happened to any Bill in 20 years—did the Conservatives not ban social media for under-16s through that Bill when they were in government?

Julia Lopez

This is a Conservative amendment in the Lords that has gained cross-party support, so it will be coming back to us. The hon. Member raises an important point about why this policy was not brought in under the Online Safety Act. That Act tried to do many, many things. In many ways, it took so long because it risked becoming a Christmas tree Bill, and many good causes were hung off it. That did cause challenges.

I think that as the debate has moved on we have realised that it is not just about illegal content that children are being exposed to and some of the things that the Online Safety Act was trying to change. There is an issue in general about children being in this space: there are addictive algorithms, and it is not just about illegal material but the fact that it is changing how children are thinking about interacting. Maybe we have to stand back as a society and say, “This is simply not the right place for children to be. We can create adult online spaces, but for children we think that there are other ways in which they should be interacting with the world.”

--- Later in debate ---
Julia Lopez

I can agree with that. My point is that this Government are trying to suggest that a consensus can be found in the absence of their having a policy position. They are talking about a consultation, but what on earth are they consulting on? Nobody has a clue. They have not been able to say anything about what they actually want to do, because the Prime Minister has no opinions, which is why he is in such deep trouble. Those on the Labour Benches can get out of their tree and get all uppity about it, but this—[Interruption.] No, the Prime Minister is being blown around like a paper bag on this issue, and everybody knows it. First of all, he said that his children did not want to ban social media; now he says that his children are the reason why he wishes to ban social media. He said there is going to be a consultation, but it has not materialised. What does this man actually think?

Kirsty Blackman

I am glad that the hon. Member has been very clear that her position is that she supports the Lords amendment that seeks to ban social media for children. Is she aware that it would not apply in Scotland, because the territorial extent of the Children’s Wellbeing and Schools Bill, apart from one clause, does not include Scotland? I take it that her position is that she only wants a social media ban for children who do not live in Scotland.

Julia Lopez

I am sure the applicability of the legislation in Scotland is something that can be debated when the Bill comes before the House.

To give them credit, many Labour MPs understand that there is an absence of any Government position, and they will not be taking their foot off the pedal. I suspect that many may have the guts to speak out today—although perhaps not. Those MPs recognised immediately that a consultation is a mechanism for a delay that goes beyond the summer and into another parliamentary year before a sniff of legislation. That holding position is now falling apart, as we have seen from the Minister here today. It is the threat of a very large group of Labour MPs backing the Conservatives’ Lords amendment that is pushing this Government into action—it is government by rebellion. We ask the Liberal Democrats not to let us be distracted from the moment of truth that is coming up, when we hope there will be cross-party support for the noble Lord Nash’s amendment.

For too long, the internet has been treated as a space that cannot be governed. It has functioned like a pioneer society, with extraordinary opportunity but minimal rules. However, pioneer societies improvise customs and eventually retrofit themselves with rules to sustain themselves, often after hard-won experience and dispute. That is the process through which we are now going, and we are realising that, as the online society was built, we were not vigilant enough when it came to protecting childhood. We did not recognise that this new territory would bleed into the old world. [Interruption.] The Minister is shouting from the Front Bench that I am embarrassing myself. We as a Government brought forward the Online Safety Act, but there are gaps in it, and we have taken a clear position as the Opposition that we think children should not be on social media. He is looking very angry, but what is his view? Can he stand up and tell us what his personal view is? As the Minister with this responsibility, what does he think should be done, having launched his consultation with such earnestness? Come on, tell us! Would he like to tell us?

--- Later in debate ---
Julia Lopez

I am sure that the issue of the functionality list can be explored as time goes by.

It is important to point out that this is not a moral panic but a structural problem. Today the Leader of the Opposition gathered a panel of grieving parents who had lost their children, and in that context negative online activity was recognised to have real-world and utterly tragic consequences. The children had been drawn into dangerous challenges, coercive relationships, bullying and bribery, all of which created despair in those young minds.

That showed us plainly why the pioneer phase must now come to an end, at least where children are concerned. Pioneer societies do not remain lawless forever; eventually they are retrofitted with rules and boundaries, and protections for the vulnerable. It is striking that, after years of the problem building up, countries around the world are reaching the same conclusion with remarkable synchronicity—not because it is fashionable, because Governments are copying one another or because anyone thinks that this will be particularly easy to impose and enforce, but because the evidence has accumulated to a point at which denial is no longer credible. If social media were broadly harmless for children, this would not be happening, but Governments with very different political traditions are acknowledging the same reality: that when it comes to children, some control must be wrested back. I suspect that this trend will be reflected vividly in the Chamber today, with examples from across the nation of what is happening in the real world because of the laxity in the online world.

Kirsty Blackman (Aberdeen North) (SNP)

I asked the hon. Lady’s Government to ban suicide forums that encourage young people to harm themselves. I asked her Government to ban eating disorder forums that encourage eating disorders. Her Government refused to do that in the Online Safety Act 2023, despite our asking for it to happen. How can she stand there now and take the moral high ground when her Government refused to ban the worst, most egregious, most harmful platforms? The Conservatives do not have a moral high ground on this issue.

--- Later in debate ---
Emily Darlington

I absolutely agree. Young people, particularly those in their mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on a different phone, and they will see a completely different world.

Kirsty Blackman

A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.

Emily Darlington

That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop these outcomes, the technology exists to do it. We are not asking them to reinvent the wheel or come up with new technology. It already exists, because they are already microtargeting two different sides of the same road.

Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content is related to ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube what rating YouTube Kids has for about a year now. Is it rated U? Is it 12A? Is it 15? It cannot tell me, because it does not do things on that basis.

As a parent, I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game ratings. We see ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? What is the age at which in-app purchases should be allowed in a game?

We must consider the time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools use the same language, we use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.

We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of child sexual abuse images are self-generated—and we need to stop end-to-end encrypted services from sharing them. We have technology that can do that. We should always keep the ability to ban in our pockets, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

This makes me more frustrated than just about anything else in this place: the levels of ignorance, stupidity and hypocrisy from so many people in here, specifically about children’s access to social media. I fully intended to support the Lib Dems’ position, but the longer their spokesperson, the hon. Member for Twickenham (Munira Wilson), spoke, the less I wanted to do so.

I do not believe that the Government’s position on this is 100% right. I am glad that they are having a consultation, but I do not like the way that they are amending the Children’s Wellbeing and Schools Bill, which is a devolved Bill, to change the territorial extent to bring that into scope. A Bill that we have not scrutinised, because it is a devolved Bill, will now have a reserved section in it. At the moment, the Bill does not apply to Scotland, apart from one clause; now, it will apply to Scotland, because it will include this. As we have not had the opportunity to scrutinise it, we have not been involved in that process. I do not think it is right that the proposed amendment should come forward in this way, although I appreciate why the Government are doing it. That is why I am asking for the amendment to be shared with us as soon as possible so that we can see it, because we have not had a chance to look at the Bill as it has gone along.

The Lib Dems have said that they have made their position clear. I have so far been able to find three amendments and new clauses to the Online Safety Act put forward by Lib Dems during its passage through the House. One of them was put forward by the Lib Dem spokesperson, who asked for an independent evaluation within 12 months of whether more platforms should be subject to child safety duties—this is the same party that is currently accusing the Government of kicking the can down the road, despite asking for a 12-month independent evaluation. There is hardly anything in the Lib Dems’ previous positions that helps me to understand their current position.

The Tory party’s position is totally incoherent, too. The Tories refused my amendment on reducing habit-forming and algorithmic features. They also refused my amendments on livestreaming.

By the way, before the Minister’s “Dear colleague” letter, livestreaming had been mentioned 53 times across the two Houses. A third of those mentions were me talking about how livestreaming for children should be banned. Before today, Roblox had been mentioned 32 times across both Houses—15 of those mentions were me saying that Roblox is not a safe platform for children.

I am massively in favour of improving the online world for children. I think social media should be about looking at videos of cats. I love videos of cats—they are absolutely brilliant. That is what it should be for. I also think it is a great place for children to interact with one another.

Like some others in the Chamber, I have been making the case that there are dangers on social media that can be easily tackled by changing the Online Safety Act. We could have got rid of those algorithmic features for children, for example. We could have got rid of livestreaming for children through the amendment I tabled. We could have got rid of children’s access to private messaging features with people they do not know, through another amendment that I tabled to the Online Safety Act.

I do not like the way the Government are doing this, though. They are proposing an amendment to the Children’s Wellbeing and Schools Bill, and then we will have secondary legislation that will, possibly, amend the Online Safety Act—I am not 100% clear on how it is going to go. I appreciate that there needs to be a consultation.

Before the 2024 Parliament, there were about three people in this entire place who had any grip on what the online world might be like for children. One of them was the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), who talked about some of these things. I asked the Minister at the time whether Fortnite would be included in the scope of the Online Safety Act, and they said, “If there’s text chat.” Text chat in Fortnite—it is an online game! There is not enough expertise in this place. Much as I hugely appreciate the people who work on writing Bills and the work of some of the experts at Ofcom, they are not experiencing the online world that children are experiencing. That is why we need to listen, to ensure that any changes that are made tackle the most harmful behaviours, places and functionalities on the internet.

I appreciate that the Government are trying to take action on this now. However, one of the few things that has made me cry in frustration in this place was one of the first things this Government did when they came in, when they brought in secondary legislation to categorise platforms and refused to include the small, high-risk platforms that had been added in the House of Lords. They said they were categorising as category 1 only platforms like Facebook, which meet a certain threshold. I was so frustrated by that choice by the Government.

There needs to be more listening and learning about where the actual dangers are, and taking action on them. Please, do that in consultation with those of us who do understand this. Please, listen to experts on this.

--- Later in debate ---
Victoria Collins (Harpenden and Berkhamsted) (LD)

I have been quite shocked at some of the procedural discussion, for several reasons. First, we are acting as though this has just come up, but even in the House of Commons under this mandate, as my hon. Friend the Member for South Devon (Caroline Voaden) mentioned, the safer phones Bill was put forward in 2024. As Liberal Democrats, we put forward amendments to change the age of data consent and to ban addictive algorithms. There have also been calls to act on doomscroll caps, and we have highlighted the harms of AI chatbots. Yet we are at a point—I absolutely respect what the hon. Member for Aberdeen North (Kirsty Blackman) was saying on this—where a consultation was proposed by the Government over a month ago, but we still do not know the details. There are things going through the House of Lords that, again, we do not know the details of. At the very least, Liberal Democrats are trying to give the space for that and say, “Yes, we need to start putting forward that legislation.” If there is another chance to debate that, what is the harm in this motion, given that this is such a crucial issue?

Secondly, it is not as if this is an issue that turned up yesterday. As the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah) talked about, these harms have been happening for years—over 22 years for Facebook. I will go on to say more about that in a moment. Other countries around the world are showing leadership on this and saying that we have to act now. My point is that at the very least, a consultation could have been launched earlier. This is not something new in this Parliament. We are saying that action needs to be taken.

Most importantly, the parents, children and experts watching this debate want to see us taking this issue seriously. Children and young people are at the heart of this. I think back to the first time I met some of the sixth-form students at Ashlyns school in Berkhamsted. I will never forget sitting around that table with one sixth-former—let’s call him James. He told me about his fears for the mental health of his friends. He warned about the self-harm that he was seeing among his peers, which his teachers were not even aware of, and he talked about the role of social media. A few weeks later, I was pulled to one side at St George’s school in Harpenden, where some young women shared with me their concerns about the growing misogyny lived out by young men, which started on social media.

Since then, I have carried out a “Safer Screens” tour meeting young people. Students have talked about brain rot and seeing the extreme content that the algorithm continues to push on them, even when they try to block it—the hon. Member for North West Cambridgeshire (Sam Carling) talked about that. One student said, “It is as addictive as a drug”, and they see the harms of it every day.

This is the tipping point, and I am surprised that many Members think that it is not. This is that moment. Parents, teachers, experts and even young people are crying out for action, and have been for a long time, to tackle the social media giants that have no care for their mental health. As I said, this tipping point has been years in the making. Facebook was launched 22 years ago. Indeed, a Netflix documentary from six years ago started to highlight the warnings from people who worked in tech about social media. One expert said that it is

“using your own psychology against you.”

Having worked in tech myself, I have read the books and received the training on how these social media giants get us hooked—it is built in.

Awareness is growing. I thank Smartphone Free Childhood, Health Professionals for Safer Screens, the Molly Rose Foundation, the Internet Watch Foundation and the Online Safety Act Network, along with projects such as Digital Nutrition—the hon. Member for Milton Keynes Central (Emily Darlington) and others have made the analogy of an online diet—that have worked to ask what the guidance should be. Those are just a few of the organisations I could name that have worked tirelessly to ensure these voices are heard.

I also thank pupils in my constituency from Roundwood Park, St George’s, Sir John Lawes, Berkhamsted and Ashlyns schools, and students who have openly shared their experiences, hopes and concerns about the online world. Their concerns are not just about content; they are also about addiction. Let me be clear: as my hon. Friend the Member for Mid Dunbartonshire (Susan Murray) mentioned, the core of this issue is that this is the attention economy, so our children are the product. Their attention, focus and time are being sold to line the pockets of tech billionaires. Governments around the world are finally taking action. This is a seatbelt moment where we need to say, “Enough is enough.”

The hon. Member for Stoke-on-Trent Central (Gareth Snell) talked about trying to get this right. I respect that, but I often think that if we were able to walk down the street and see a 3D version of what young people are seeing in their online world, action would have been taken much sooner. My hon. Friend the Member for Eastleigh (Liz Jarvis) talked about holding tech companies to account. We need to start unpacking what children are seeing and finally take action.

The Online Safety Act has done great work, but it does not go far enough. It sets out illegal harms and a code for inappropriate content for children and over-18s, but not a framework of legal harms or age-appropriate content. The social media age of 13 is based on data processing that is managed by the Information Commissioner’s Office and has nothing to do with what is age-appropriate in that context. Dr Kaitlyn Regehr, the author of “Smartphone Nation”, talks about how the Act is reactive, not proactive, and leaves it up to the user to report problems rather than putting the burden of safety on tech giants.

We must ensure that we build on the OSA and learn the lessons from Australia. The hon. Member for Milton Keynes Central talked about this. In Australia, a wide definition of social media has left it to a small group to decide what is appropriate. That has meant that YouTube has been banned for under-16s, but YouTube Kids has not, with no real framework for why apart from the fact that they deem YouTube Kids safer. WhatsApp has not been banned, which is possibly the right thing, but legislators are left to play whack-a-mole as new social media apps pop up. There is no framework for harm from AI.

Kirsty Blackman

Will the hon. Member give way?

Victoria Collins

Very briefly; I want to leave the Minister time.

Kirsty Blackman

Australia just bans children from holding accounts; it does not ban them from using any of the platforms. They can still use YouTube; they just cannot have an account.

Victoria Collins

Absolutely. YouTube is everywhere. It is embedded in almost every website that has videos.

The hon. Member for Aberdeen North (Kirsty Blackman) asked about AI chatbots. In the proposals we put forward in the Lords, user-to-user services include AI chatbots. We have highlighted for a long time that the potential harms from AI chatbots are not covered. Ofcom has clarified that AI chatbots count as user-to-user services, but the harms, such as AI psychosis, which my hon. Friend the Member for Winchester (Dr Chambers) alluded to, are not covered. That is why the harms-based approach we are putting forward is so important.

As my hon. Friend the Member for Twickenham (Munira Wilson) said when she opened the debate, the Liberal Democrats have been leading the work on online safety in this Parliament. We were the first party to push a vote on banning addictive algorithms. We have called for health warnings and a doomscroll cap. Today, we are calling for a vote on the age for social media and online harms. We are calling for a ban on harmful social media based on a film-style age rating. That harms-based approach holds tech companies to account, sets a pioneering approach to online standards and prepares for the future of AI chatbots and games like Roblox, which has already arrived.

In the offline world, anyone buying a toy for young children at this point would expect age ratings so that they know it is appropriate and safe, and films have had age ratings for over 100 years, yet we have not had that in the online world. The harms-based approach is backed by 42 charities and experts who work to protect children, stop violence against women and girls and make the internet a safer place.

We are also calling for a reset, because enough is enough. That includes a minimum age of 16 for social media and real accountability for tech companies with film-style age ratings. We need to make sure that we get the best out of the internet for young people and protect them from harms.

For me, it comes back to James, his friends and the young women and children I have spoken to around my constituency. We do not have time to waste—that is why we are pushing for these Bills. We are calling for action, and I call on MPs across the House to put children before politics, exactly as we did in the Lords. The amendment in the Lords could mean a blanket ban. We were uncomfortable with that approach—we much prefer ours—but we knew that the future of children came first. We must help the next generation to get the best of the online world—including those young people who have spoken out and shared their concerns and horror stories—and protect them from the worst of it.