Online Harm: Child Protection

Chi Onwurah Excerpts
Tuesday 24th February 2026


Commons Chamber
Madam Deputy Speaker (Caroline Nokes)

I thank the hon. Member for his point of order. The motion on the Order Paper is perfectly orderly, so Members will be invited to vote on that, not on the substance of any Bill that might come on 9 March. I think it is important that the House is clear on that.

Dame Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

Further to that point of order, Madam Deputy Speaker. How can I assess what is orderly for my contribution to the debate given that the substance of the motion is about process? To be frank, I do not want to speak about process; I want to speak about protections for children.

Madam Deputy Speaker (Caroline Nokes)

The motion is to give consideration to a Bill on the specific matter which has been outlined clearly on the Order Paper: “Protections for children from online harms”. I reassure the hon. Lady that any contribution she chooses to make on that matter would be in order.

--- Later in debate ---
Dame Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action.

I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.

The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows.

As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology for Ofcom. I remember meeting people from a US platform, which shall remain nameless, around 2005. The company executive commented that they had come to the UK from Silicon Valley on a six-month contract to sort out Government affairs, and they could not understand why, two years later, discussions were still ongoing. Did we not realise that Government had no role in what they did?

I say that to illustrate that tech platforms have their origins in a libertarian, small/no-government tech bro bubble that has spread globally. TikTok, as a Chinese company, has a different background, but public accountability is not necessarily part of it. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, even as the use of social media and life online exploded. That is why I consider the Tory position in this debate to be a superb example of hypocrisy.

Monica Harding (Esher and Walton) (LD)

The hon. Lady is making a powerful speech about the evolution of social media platforms. I have four children; the first was born in 2004 and the last was born in 2011, so their births have spanned that evolution. Facebook began in 2004; TikTok began in 2016. If that evolution was the industrial revolution, we would be around the spinning jenny stage, with AI chatbots the next destination. Those chatbots are terribly dangerous for our children, and we need to regulate them now. That should be within the Online Safety Act.

Dame Chi Onwurah

I agree that AI chatbots are a further evolution, and I think we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet in how we approach AI. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.

Matt Rodda (Reading Central) (Lab)

My hon. Friend the Chair of the Select Committee is making an excellent speech. Her background in this area is really showing in the detail with which she is exploring these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face. For me, part of the importance of the consultation is to allow parents to think more deeply about this difficult issue; there are often different opinions from campaigners who have had the most painful experiences.

Dame Chi Onwurah

My hon. Friend makes an excellent point. It is for that exact reason that I support a consultation: this is part of a debate, and we all need to improve our understanding of the impacts of this technology. Parents are in a difficult position. I do not believe parents should have to be technology experts in order to give their children the best start in life, but unfortunately there is so much pressure in the online world that that seems to be the case right now, and that is why it is right that Government take action and consult on the action they take.

Let us think about the evolution of these technologies. I remember that when I joined Facebook in 2005, I had to use my university email address—that meant I had to be over 18. Some 20 years later, 13-year-olds and younger are having their lives and brains formed by almost uninhibited access to social media. In the UK, the number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter Molly took her own life at the age of 14 following exposure to self-harm content online; I have spoken to the bereaved parents of children bullied to death online; and I have spoken to the Internet Watch Foundation about the horrendous images its staff see of child exploitation. The fact that the Conservatives did nothing in all those years in government is, in my view, a form of political negligence of the highest order.

As part of my Committee’s inquiry into social media and algorithms, Google, Meta, TikTok and X told us that they accepted their responsibility to be accountable to the British people through Parliament, which I thought was quite a step forward from previous utterances, and ongoing utterances, by some tech billionaires who shall remain nameless. Our inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media has many important and positive contributions, including helping to democratise access to a public voice and to connect people far and wide, but it also has significant risks—and those risks can evolve with the technology. We spoke about AI as an evolution, and one of the main failings of the Online Safety Act is that it regulates particular services rather than establishing principles that remain true and can be part of a social consensus as technology evolves.

Bobby Dean

The hon. Lady is making an excellent speech. Should one of those principles be related not only to content but to the addictive nature of these platforms? One of the changes I have witnessed on social media over time is algorithmic addiction. The greatest minds in the world are now working out the circuitry of our brains and driving content towards us so that we look at our screens for longer and they can sell more ads. Does she agree with that point?

--- Later in debate ---
Dame Chi Onwurah

I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.

This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.

Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.

Natasha Irons

I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but was responsible for the editorialisation of that content. It was beholden to certain standards. Does she agree that we should be holding these media companies—they are not now “new media” companies, but legacy media companies—just as responsible for the content they put out on their platforms as any broadcaster?

Dame Chi Onwurah

My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.

The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.

Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.

I always say to the platform companies that the opposite of regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a

“failed state where laws are ignored and crimes are tolerated”.

There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on protecting and keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, they are not in place now.

--- Later in debate ---
Ian Murray

I appreciate the right hon. Gentleman’s intervention. [Interruption.] I am sorry to upset my hon. Friend the Member for Stoke-on-Trent Central (Gareth Snell). The Government are acting at pace, but we want to act in the right way. We must act in the right way because this is such a complex and serious issue. It is important for children to be able to seize the opportunities that being online can offer. We have heard about iPad-only schools. Parents must be confident that their children are safe—that is key. If we do not want to exclude children from age-appropriate services that benefit their wellbeing, we must act on the evidence and ensure that we strike the right balance between protecting children’s safety and wellbeing, and enabling them to use technology in positive and empowering ways.

Dame Chi Onwurah

Does my right hon. Friend share my disappointment that, in this debate on protecting children from some of the most obscene abuse, not one Reform Member is present?