Social Media Use: Minimum Age

Maya Ellis Excerpts
Monday 24th February 2025


Westminster Hall


Tony Vaughan

I completely agree with my hon. Friend, and I congratulate the children in his constituency on taking a very sensible approach. It is interesting that children themselves are coming forward and saying that—perhaps because they see the harms that I am talking about and want to do something about them. We have seen cases where children as young as 14 have taken their own lives after being bullied or exposed to harmful online content. During preparation for this debate, I was informed by the National Society for the Prevention of Cruelty to Children (NSPCC) that there is a website posing as a community that encourages suicide. That is the dark and depressing side of the online world that we have to do more to tackle.

What about addiction? Social media platforms are designed to exploit vulnerabilities in our young people. Algorithms push harmful content—videos about body image issues, self-harm or anxiety—directly on to their feeds. A recent survey showed that on TikTok the algorithm was 4,343% more likely to show toxic eating disorder content to users who were already vulnerable to such issues. Many Members will have seen the Channel 4 documentary “Swiped”, in which a secondary school took away the phones of year 8 pupils for 12 weeks to see what would happen. The results were striking: children talked to their friends more, reported less anxiety and were more focused in class.

Maya Ellis (Ribble Valley) (Lab)

Given that 70% of youth services investment has been slashed since 2010, does my hon. and learned Friend agree that we need to provide opportunities, aside from school, where children can interact before taking away one of the few places that they have to spend time with their peers?

Tony Vaughan

My hon. Friend raises a really important point. This cannot be about shutting down avenues for young people to socialise with each other. Whatever action is taken to make it harder for young people to access social media, we have to make sure that other things are going on in society so that they do not feel that that is the only place they can go to socialise.

The petitioners’ view, as I said, is that we should ban access to social media until children are 16. I spoke to the NSPCC before this debate; its position is that an outright ban is not the answer. Without changes to the software or to the devices themselves, a ban on children using social media would, on its own, be unenforceable. The NSPCC’s view is that a ban would also push children into unregulated and more dangerous online spaces.

Does the Online Safety Act do enough? Several people I spoke to in preparing for this debate think that it does. For example, there is a requirement for social media companies to conduct children’s access assessments to determine whether children are likely to access their platform. There are online age assurance measures that require social media companies to assess whether their services are likely to be accessed by children and to adopt robust methods such as photo ID matching, facial age estimation and mobile network checks.

Age assurance measures are of course right, but groups such as Smartphone Free Childhood do not believe that risk assessments, and the Online Safety Act more broadly, go far enough. Rather than an approach of risk assessment and risk reduction, they say that the onus should be on the social media companies to demonstrate that their apps are safe for children to use and that, if they cannot, their apps must not be used by children. That would reverse the current position, in which the onus is on the regulator to prove that an app is dangerous or harmful. It may well be that the code of practice under the Online Safety Act could achieve that. It would require tightening the code, so it would be useful to know whether the Minister agrees that the Act is capable of reversing that burden, and that we ought to consider those methods.