The hon. Gentleman tempts me to broaden the debate, which I do not think you would encourage me to do at this late stage, Madam Deputy Speaker. However, he makes a very important point about self-regulation in this sector. The public, parents, and indeed children look to us to make sure we have their best interests at heart.
The Online Safety Act may only say that age minima should be enforced “consistently” rather than well, but I do not think the will of this Parliament was that it would be okay to enforce a minimum age limit consistently badly. What we meant was that if the law says right now that the age minimum is 13, or if it is 16 in the future—or whatever other age it might be—companies should take reasonable steps to enforce it. There is more checking than there used to be, but it is still very limited. The recent 5Rights report on Instagram’s teen accounts said that all its avatars were able to get into social media with only self-reported birth dates and no additional checks. That means that many thousands of children under the nominal age of 13 are on social media, and that there are many more thousands who are just over 13 but who the platform thinks are 15, 16 or 17, or perhaps 18 or 19. That, of course, affects the content that is served to them.
Either Ofcom or the ICO could tighten up the rules on the minimum age, but amendment 9 would require that to happen in order for companies to be compliant with the ICO regulation. The technology does exist, although it is harder to implement at the age of 13 than at 18—of course, the recent Ofcom changes are all about those under the age of 18—but it is possible, and that technology will develop further. Ultimately, this is about backing parents who have a balance to strike: they want to make sure that their children are fully part of their friendship groups and can access all those opportunities, but also want to protect them from harm. Parents have a reasonable expectation that their children will be protected from wholly inappropriate content.
I rise to speak to new clauses 1 and 11, and briefly to new clause 2. The Liberal Democrats believe that the Government have missed a trick by not including in this Bill stronger provisions on children’s online safety. It is time for us to start treating the mental health issues arising from social media use and phone addiction as a public health crisis, and to act accordingly.
We know that children as young as nine and 10 are watching hardcore, violent pornography. By the time they are in their teens, it has become so normalised that they think violent sexual acts such as choking are normal—it certainly was not when we were teenagers. Girls are starving themselves to achieve an unrealistic body image because their reality is warped by airbrushed images, and kids who are struggling socially are sucked in by content promoting self-harm and even suicide. One constituent told me, “I set up a TikTok account as a 13-year-old to test the horrors, and half a day later had self-harm content dominating on the feed. I did not search for it; it found me. What kind of hell is this? It is time we gave our children back their childhood.”
New clause 1 would help to address the addictive nature of endless content that reels children in and keeps them hooked. It would raise the minimum age for social media data processing from 13 to 16 right now, meaning that social media companies would not be able to process children’s data for algorithmic purposes. Children would still be able to access social media to connect with friends and access relevant services, which is important, but the new clause would retain exceptions for health and educational purposes, so that children who were seeking help could still find it.
We know that there is a correlation between greater social media use among young people since 2012 and worsening mental health outcomes. Teachers tell me regularly that children are struggling to concentrate and stay awake because of lack of sleep. Some are literally addicted to their phones, with 23% of 13-year-old girls in the UK displaying problematic social media use. The evidence is before us. It is time to act now—not in 18 months and not in a couple of years. The addictive nature of the algorithm is pernicious, and as legislators we can do something about it by agreeing to this new clause 1.
It is time to go further. This Bill does not do it, but it is time that we devised legislation to save the next generation of teenagers from the horrors of online harm. Ofcom’s new children’s code provides hope that someone ticking a box to say they are an adult will no longer be enough to allow access to adult sites. That is a good place to start; let us hope it works. If it does not, we need to take quick and robust action to move further with legislation.
Given the nature of the harms that exist online, I also support new clause 11 and strongly urge the Government to support it. No parent should have to go through the agony experienced by Ellen Roome. Losing a child is horrific enough, but being refused access to her son’s social media data to find out why he died was a second unacceptable agony. That must be changed, and all ISPs should be compelled to comply. New clause 11 would make that happen. I heard what the Minister said about coroners, but I strongly believe that legislation is needed, with a requirement to release data or provide access to the child’s account to any parent or guardian of someone under 18 who has died. There is, as far as I can see, no reason not to support this new clause.
Briefly, I echo calls from across the House to support new clause 2 in support of our creatives. Creativity is a uniquely human endeavour. Like others, I have been contacted by many creators who do not want their output stolen by AI companies without consent or permission. It is vital that AI companies comply with copyright legislation, which clearly has to be updated to meet the requirements of the brave new world of tech that we now live in.