Online Harm: Child Protection

Natasha Irons Excerpts
Tuesday 24th February 2026


Commons Chamber
Natasha Irons (Croydon East) (Lab)

I thank the Minister for the decisive action that he took over the recent Grok incident. Given the scope of the consultation and the fact that we are talking about online harms, I want to flag the issue we have with content on YouTube, which is a video-sharing platform, not necessarily a social media platform. The type of content that our children are consuming there is a quick succession of images, rather than the slower-paced content we get when we watch a broadcaster, and that is not good for a child’s development. Will the consultation look at the quality of content on these platforms? Not all screentime is equal; some screentime can be quite dangerous for a child’s development in general.

Kanishka Narayan

Both of my hon. Friend’s points, on the scope of how we look at particular platforms and at their functionalities, are not just considered by the consultation but are deeply important. I engaged with the Australian Minister on this issue just last week, trying to understand Australia’s experience of this and the difficulty of getting those two things right. That is exactly why the consultation has been an appropriate approach in this context.

Where services fail to comply with their duties in the Act, Ofcom’s enforcement powers include fines of up to £18 million or 10% of qualifying worldwide revenue. Ofcom has indicated that it has issued financial penalties to six companies under the Online Safety Act amounting to more than £3 million. I can confirm to the House that just yesterday, Ofcom announced that it has fined a porn company £1.35 million for failing to introduce proper age verification on its websites—the largest fine levied so far under the Act. I welcome this strong action to protect children online.

We have always been clear that while the Online Safety Act provides the foundations, there is more to do to ensure that children live enriching online lives. Like all regulatory regimes, it must remain agile. That is all the more critical given that we are dealing with fast-moving technology. That is why this Government have already taken a number of decisive steps to build on these protections.

The first act of my right hon. Friend the Secretary of State was to make online content that promotes self-harm and suicide a priority offence under the Online Safety Act. That means that platforms must take proactive steps to stop users seeing this content in the first place. If it does appear, platforms must minimise the time that it is online. As well as that, both intimate image abuse and cyber-flashing are now priority offences under the Online Safety Act.

Last month, my right hon. Friend the Secretary of State stood in this Chamber and made it clear that the creation of non-consensual deepfakes on X is shocking, despicable and abhorrent. She confirmed that we would expedite legislation to criminalise the creation of non-consensual intimate images, and I am pleased to confirm to the House that that came into effect earlier this month. That will also be designated as a priority offence under the Online Safety Act, and it complements the existing criminal offence of sharing or threatening to share a deepfake intimate image without consent.

Alongside that, it was announced that we will legislate to criminalise nudification tools, making it illegal for companies to supply tools used to generate non-consensual intimate images. Last week, we went further still and announced that we will introduce a legal duty requiring tech companies to remove non-consensual intimate images within 48 hours of their being reported. These measures will provide real protection for women and girls online.

However, we recognise the strength of feeling up and down the country and right across this House—not least in this debate. We share the concern of many parents about the wider impact of social media and technology on children’s wellbeing. The rapid growth of grassroots campaigns such as Smartphone Free Childhood highlights how concerned parents are about the pull of these technologies and what it means for their children. That includes the potential impacts on mental health, sleep and self-esteem.

We have set out our commitment to supporting parents and children with these issues. We want to find solutions that genuinely support the wellbeing of our children and to give parents the help that they need as they guide children through online spaces safely.

--- Later in debate ---
Julia Lopez

There were very real and important debates during the passage of that Bill about legal but harmful material and whether people should be able to speak freely online. Our approach was to seek to create a space where adults can speak freely while accepting that children should not be in some of these spaces. That was the point that the Leader of the Opposition was trying to make.

We were moving very dangerously into the realm of restricting free speech, and it is not for an online regulator to start telling people what they can and cannot say online when it is not something that is illegal to say in the real world. That was the challenge that we got ourselves into as a Government, and that is why we changed parts of the approach that we were taking to the Online Safety Bill. I appreciate the concerns that are being raised, and I am trying to answer them as honestly and straightforwardly as I can.

When we consider the amendment from Lord Nash, this House will have its opportunity to make an unequivocal statement of principle: that when we believe that something is harming children at scale, we accept that it is insufficient to leave the status quo unchallenged or simply to commission a consultation. That applies especially when it is a consultation to which this Government have provided absolutely no political direction or view and that has been much trailed but still not actually launched. In truth, this consultation was not ready. It was a mechanism to get the Prime Minister out of another of his tight fixes.

The Tech Secretary might be very good at emoting and telling us all how impatient she is for change, how she cares, and indeed for how many years she has cared, but when she made her statement on social media for children in this Chamber a few weeks ago, she said nothing about what the Government would actually do, beyond seeking more time to take a position. I commend the hon. Member for Twickenham for pointing that out, and I have sympathy with why she is trying to use this mechanism today, because we are all trying to tease out what the Government are seeking to do.

It was extraordinary to listen to the Government Minister, who said with great sincerity, “We will act robustly in responding to a consultation.” What does he actually believe? What do the Government think we should do on this issue? Nobody has a clue. They are talking about a huge range of things that could be done, but it is for a Government to provide political direction; it is not for a Government to seek consensus. [Interruption.] It is for a Government to take a position and to take a view. It is for a Government to have opinions. It is for a Government to have policy positions. It is not for a Government to try to make sure that everybody in this House agrees. [Interruption.] It is pathetic to see those on the Labour Benches getting out of their tree about this.

Natasha Irons

I sincerely thank the hon. Lady for giving way. When we talk about the consultation, it is not necessarily about seeking consensus in this place; it is about seeking consensus with parents and children, and with people outside this place. Banning social media for children is a good approach, but this is not just about that, is it? It is also about the time that our kids are spending on screens. That is what this is about: it is about having a digital childhood that we can all get behind and support.

Julia Lopez

I can agree with that. My point is that this Government are trying to suggest that a consensus can be found in the absence of their having a policy position. They are talking about a consultation, but what on earth are they consulting on? Nobody has a clue. They have not been able to say anything about what they actually want to do, because the Prime Minister has no opinions, which is why he is in such deep trouble. Those on the Labour Benches can get out of their tree and get all uppity about it, but this—[Interruption.] No, the Prime Minister is being blown around like a paper bag on this issue, and everybody knows it. First of all, he said that his children did not want to ban social media; now he says that his children are the reason why he wishes to ban social media. He said there is going to be a consultation, but it has not materialised. What does this man actually think?

--- Later in debate ---
Julia Lopez

I agree with the hon. Member wholeheartedly.

Until now, we have implicitly decided that childhood must simply adapt to an environment that we as adults find totally overwhelming, undermining of our own sense of self and completely irresistible. We have been exposing our children to this place of no settled social rules, where that exposure is constant, the boundaries are porous and responsibility is diffuse. Behaviour that would never be tolerated offline is normalised, monetised and then algorithmically amplified. The Online Safety Act, which we have discussed already, has been a step forward in trying to wrest back control, but it is, of course, an imperfect one. It focuses primarily on illegal content, seeks to keep the most extreme material offline and introduces age-gating for pornography and other over-18 content. That work does matter, but the problem before us today goes well beyond illegality and explicit material. There are also many concerns about the complexity of policing content, in terms of both implementation and intent.

The central question is not just what children see but how social media works. Social media platforms are addictive by design. Their algorithms are engineered to maximise engagement and stickiness. They reward outrage, comparison, emotional intensity, competition and repetition. They draw children away from purposeful activity and into feedback loops that erode attention and resilience. Not all platforms operate like this globally, funnily enough. The Chinese version of TikTok is time-limited and feeds children content of scientific or patriotic value. In the west, it is emotional arousal that is fed to our kids.

Children are not simply consuming content; they are being shaped by the environment itself. It is happening when their brains are still developing. Their impulse control, emotional regulation and ability to assess risk are not the same as for adults. We recognise this everywhere else in law—in alcohol limits, in safeguarding rules and in age of consent protections—yet online we have decided to suspend that logic, and the consequences are increasingly visible.

Natasha Irons

I am new to this place and clearly still learning, but I am wondering why, in that case, measures on designing out at source the harms that the hon. Member is talking about were watered down in the Online Safety Bill. She is absolutely right: we are creating online worlds, and they should be designed to be safe. Just as we design clothes for children that do not contain toxic materials, we would hope that the spaces they inhabit online also do not contain toxic material, so why were those protections not strengthened in the Bill that the Conservative party passed when it was in power?

Julia Lopez

I have set out before what we were trying to achieve with the Online Safety Act and why certain things were in it and others were not. I do not want to go over that again.

The consequences of these design features are increasingly visible, including rising anxiety and low mood, poor sleep, shredded attention spans and cyber-bullying that follows children home.

--- Later in debate ---
Dame Chi Onwurah

I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.

This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.

Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.

Natasha Irons

I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but it was responsible for the editorial curation of that content, and it was beholden to certain standards. Does she agree that we should be holding these media companies, which are no longer “new media” companies but legacy media companies, just as responsible for the content they put out on their platforms as any broadcaster?

Dame Chi Onwurah

My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.

The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.

Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.

I always say to the platform companies that the opposite of regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a

“failed state where laws are ignored and crimes are tolerated”.

There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on protecting and keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, they are not in place now.