Social Media Use: Minimum Age

Sarah Russell Excerpts
Monday 24th February 2025


Westminster Hall

Mrs Sarah Russell (Congleton) (Lab)

It is an honour to serve under your chairmanship, Mr Stringer. I did a lot of research in preparation for my speech today and, as a parent of three primary-age children, what I found really alarmed me. The National Society for the Prevention of Cruelty to Children reports that there were more than 7,000 offences of sexual communication with children last year, a significant increase on the year before. It says that in those offences the perpetrators typically start to talk to children on fairly mainstream web services, and then encourage them to communicate instead on more private messaging services such as Snapchat, WhatsApp and Instagram. I was pretty shocked; I did not appreciate that this was such a widespread problem. We all know that if 7,000 offences were reported to the police, a considerably larger number will have happened. I also discovered the prevalence of dating app use among children. Children experience terrible offences when they go to meet people who turn out to be adults preying on them.

Fundamentally, we need to understand that when we talk about social media, children are a product. If anything that we use on the internet does not cost any money, the gain for the provider is access to our thoughts, feelings and communications—in this case, our children’s thoughts, feelings and communications with their friends. We have a generation now for whose entire lifespan those thoughts, feelings and communications with friends can be monetised and tracked across multiple different websites or social media apps. The complex picture that those companies have of our children is incredibly sophisticated, and their ability to target content at them is like nothing we have ever even imagined.

There is also a problem with parents inadvertently facilitating some of this. I would count myself within that description to some extent, so this is certainly not a judgmental point. When a parent naively says that a child of 13 can access something they would broadly consider uncontroversial—such as WhatsApp, so the child can chat to their friends—that creates an ageing risk throughout the lifespan of that app use. As was mentioned previously, children subsequently appear to be 16 or 18 before they actually are, and therefore obtain access to services that are unsafe for them much younger than they otherwise would have done. The parents do not appreciate the ageing risk that they are creating, potentially several years down the line.

The NSPCC says that we have a fundamental problem. We now have the Online Safety Act, introduced by the Conservatives, and we are working hard as a Government to bring it into force. Ofcom has been given a significant role in scrutinising child risk assessments by online providers. We all know that if those providers had children's best interests at heart, they would already have done a lot of the things that Ofcom requires. The fact that Ofcom is having to investigate OnlyFans, and its ability or willingness to prevent under-age children from seeing sexualised content, does not sit comfortably—that is the minimum I will say about it.

[Martin Vickers in the Chair]

If I am honest, I am not quite sure what the right solution is to those problems. If we do not get societal consensus on the right solution, we will, for instance, carry on seeing parents helping children to circumnavigate age restrictions, and children using VPNs to circumnavigate them themselves. Plenty of teenagers are sophisticated enough to do that. I am not sure what the right answer is. I am not sure that preventing under-16s from accessing such content will solve it. There is a risk that it will create a false sense of security and enable providers of the facilities and apps to say, “Well, under-16s can’t use it. We don’t have to put any safety features in because children are not allowed it anyway.” They will completely abdicate responsibility.

It is important that we keep talking about these issues, and that we move forward on a cross-party basis. These are sophisticated problems and I am not sure whether we have a sufficiently sophisticated response to them. The Online Safety Act provides us with a lot of tools, and I can see that its potential fines of 10% of global revenue are quite high. That has the potential to drive some behaviour change, provided the companies involved really see that the tools have teeth. I hope that we will monitor very closely how Ofcom gets on with the new legislation; I am sure that Members of all parties will be interested in that.

My hon. and learned Friend the Member for Folkestone and Hythe (Tony Vaughan) said that he spoke to his children before the debate to tell them that he was going to raise these issues. I did so with my children over breakfast this morning, and one of them berated me for not having been in her online safety assembly. We have to be realistic about the capacity of both parents and schools to manage these issues without making it a blame game between different organisations—parents versus schools versus major corporations. These corporations have a huge vested interest in exploiting our children, and we have to figure out how better to protect them.