Social Media Use: Minimum Age

James Frith Excerpts
Monday 24th February 2025


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Lola McEvoy (Darlington) (Lab)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my hon. and learned Friend the Member for Folkestone and Hythe (Tony Vaughan) and the petitioner for bringing this vital debate.

Since being elected to this place in July, I have spoken about children’s safety online several times, including in this Chamber. For too long, our children’s development, and the need to protect them from predators, from inappropriate and disturbing content, and from each other, have been treated as an afterthought. As legislators, it falls to us to protect our children, but we are way behind where we need to be.

In my constituency of Darlington, this issue came up time and again on the campaign trail, as parents, siblings and grandparents all reported feeling ill equipped to fulfil their most important role: giving their children a safe and healthy childhood. It is vital that we understand that parents are asking for our support now, because for many of them the fight, and the pressure from their own children to allow them the latest phone, more screen time, access to an adult version of a game and much more, feels relentless.

This debate is about social media, but it is also about the digital age of consent. My view is that children under 16 should not be given the responsibility to permit or to deny companies’ access to their data. The risks are too high, and the long-awaited children’s codes from Ofcom are not yet in place; we do not know what impact the measures in the Online Safety Act will have on children’s behaviour and experience online. We should, therefore, stipulate that 16 is the age of digital consent.

Last week I visited Firthmoor primary school in Darlington for an assembly on online safety. It was exceptional; the children had songs, raps, roleplay and helpful tips for staying safe online. These children, aged between four and 11, are online already. I was struck by their understanding of passive screen time versus active screen time. Passive screen time includes scrolling aimlessly through suggested content, and active screen time is about learning. These children are trying to protect themselves, but it cannot just be left to them. I do not think that we can ban children from accessing screens, but we must safeguard them from harm until they are old enough to navigate the risks themselves.

Mr James Frith (Bury North) (Lab)

My wife and I regret ever getting a smartphone for our two eldest children. We have four, and we are wondering what to do when the third expects access to the same rights. Smartphone management is something we continually get wrong. My hon. Friend has talked about screen time. It cannot be beyond the wit of our smartphone creators to make parental controls more intuitive to use, so that they cannot be undermined so easily by the smart children using the smartphones. Does she agree that while we need to strengthen the role of Ofcom in rooting out the toxic content that our children are pushed towards, the smartphone manufacturers also have a job to empower parents? It is a real concern, because children’s use of smartphones and their access to social media is a daily battle for their parents.

Lola McEvoy

I fully support what my hon. Friend says. Lots of parents in Darlington have said that although the default setting may be that children cannot access chat rooms on games or a more violent version of a game—because it is not just the phones and devices, but what children are accessing on those devices that really matters—they simply lose the battle. When it comes to the crunch, and their child is arguing that they want to go on the device and is going to have a tantrum, they just allow them to go on it. Parents need more support from us as legislators, which is basically my point.

Children should be able to enjoy games and access safe and engaging educational content. Platforms should not be allowed to target them with suggested content. That is where the problems are coming in—through suggested content, children are exposed to harmful and unhealthy things. Platforms should have child-safe search engines, and features including live location and chat rooms should be designed to be transparent and child-friendly, with children’s safety at their heart. Accessing certain social media features, such as chatting with adults who they do not know or sharing content, should be solely for those who have been strictly age-verified as over 16.

--- Later in debate ---
Mr Frith

Will the Minister give way?

Chris Bryant

Yes, of course; my hon. Friend is always intervening on me.

Mr Frith

I thank the Minister; as ever, he has been very generous and is making excellent remarks. Away from the emergency—the toxicity and the worst aspects of this—the mundane sapping of hour after hour after hour is just as dangerous when we consider social media use and our ineffective guardrails for smartphone use. Yes, we all agree that the content the Minister has described should be done away with and prevented, but what is his reflection on the mundane drip and sapping away of the energy and attention of our young people and the doomscrolling ethos that has developed in their expectation of their everyday lives?

Chris Bryant

I do not want to be a hypocrite; this 63-year-old engages in all those things as well. In fact, it is a shocking shame for me every time I get that notification that says, “You spent on average x number of hours a day on your mobile phone.” I can make justifications—I have to find out what an hon. Member’s seat is, I have to send things back to my private office on WhatsApp and all of those kinds of things—but the truth is that if somebody had said to us 40 years ago that they were going to invent something that would make us all, in an addictive way, spend hours and hours and hours looking at a phone rather than engaging with other human beings, we would have said, “Maybe not, eh?”

I was really struck by that when I went to a primary school in Blaengarw in my patch. The headteacher was saying that one of the difficulties is that all the parents waiting to pick up their kids were on their mobile phones outside, as the hon. Member for Mid Sussex (Alison Bennett) mentioned earlier. Whatever they did inside the school, the message that every single child got was that life was about being on a mobile phone. As has been said, one of the most important things that a parent can do is engage eye to eye with their children. If they are engaging eye to eye only with their phone, I would argue that that is as much of a problem. I will come on to some of the issues, but I do not want to be hypocritical about it.

I think we all accept that we have to do more. One thing that was not included in the list of things that someone might do if they did not have a mobile phone to spend all their time on was reading a book. I would love more young people to read a book. A longer attention span is one of the admirable parts of being an adult human being.

Several hon. Members referred to the fact that legislation needs to keep up. I will put this very gently to Conservative Members: we argued for an online safety Act for a long time before one ended up becoming legislation. It went through a draft process, and there were lots of rows about what should and should not be in it, and whether we were impinging on freedom of speech and all those kinds of things, but the legislation did not end up on the statute books until the end of 2023. Even then, the Act provided for a fairly slow process of implementation thereafter, partly because Ofcom was taking on powers that, on that day, it simply would not have had enough staff to engage with. The process has been difficult, and I am absolutely certain that the Online Safety Act will not be the end of this story. That is why the Secretary of State for Science, Innovation and Technology has said clearly that everything is “on the table”, and that is why today’s debate is so important.

Of course, legislation has to be proportionate, balanced, based on evidence—I will come to that in more detail in a moment—and effective. That is why the Online Safety Act will require all platforms that are in scope, including social media platforms, to set up robust systems and processes to tackle the most egregious illegal content or activity proactively, preventing users from encountering it in the first place. Platforms will be required to remove all other illegal content as soon as it is flagged to them.

The Act will also require platforms easily accessed by children—this goes to a point made by several people—to deploy measures to protect children from seeing content that is harmful to them. That includes the use of highly effective age assurance to prevent them from seeing the most harmful types of content, such as that which promotes, encourages or provides instructions for self-harm, suicide or eating disorders. Platforms will also be required to provide age-appropriate access for other types of harmful content, such as bullying, abusive content or content that encourages dangerous stunts or serious violence.

Additionally, under the Act, providers that specify a minimum age limit to access their site must specify how they enforce that in their terms of service and must do so consistently. As many Members have said, this spring will be a key moment in the implementation of the Act, and that is an important point for us to recognise: later this year, things will change, because of the implementation of the Online Safety Act. Ofcom has already set out its draft child safety codes of practice, which are the measures that companies must take to fulfil their duties under the Act.

Ofcom’s draft codes outline that all in-scope services, including social media sites, will be required to tackle algorithms that amplify harm and feed harmful material to children. I would argue that that includes the process of trying to make something addictive for a child. Services will have to configure their algorithms to filter out the most harmful types of content from children’s feeds, and to reduce the visibility and prominence of other harmful content. In January, Ofcom published its guidance for services to implement highly effective age assurance to meet their duties, including the types of technology capable of being highly effective at correctly determining whether a user is a child.