Social Media Use: Minimum Age Debate
Lola McEvoy (Labour, Darlington)
Westminster Hall
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my hon. and learned Friend the Member for Folkestone and Hythe (Tony Vaughan) and the petitioner for bringing this vital debate.
Since being elected to this place in July, I have spoken about children’s safety online several times, including in this Chamber. For too long, our children’s development, and their protection from predators, from inappropriate and disturbing content, and from each other, have been treated as an afterthought. As legislators, it falls to us to protect our children, but we are way behind where we need to be.
In my constituency of Darlington, this issue came up time and again on the campaign trail, as parents, siblings and grandparents all reported feeling ill equipped to fulfil their most important role: giving their children a safe and healthy childhood. It is vital that we understand that parents are asking for our support now, because for many of them the pressure from their own children to allow them the latest phone, more screen time, access to an adult version of a game, and much more besides, feels relentless.
This debate is about social media, but it is also about the digital age of consent. My view is that children under 16 should not be given the responsibility to permit or to deny companies access to their data. The risks are too high, and the long-awaited children’s codes from Ofcom are not yet in place; we do not know what impact the measures in the Online Safety Act will have on children’s behaviour and experience online. We should, therefore, stipulate that 16 is the age of digital consent.
Last week I visited Firthmoor primary school in Darlington for an assembly on online safety. It was exceptional; the children had songs, raps, roleplay and helpful tips for staying safe online. These children, aged between four and 11, are online already. I was struck by their understanding of passive screen time versus active screen time. Passive screen time includes scrolling aimlessly through suggested content, and active screen time is about learning. These children are trying to protect themselves, but it cannot just be left to them. I do not think that we can ban children from accessing screens, but we must safeguard them from harm until they are old enough to navigate the risks themselves.
My wife and I regret ever getting a smartphone for our two eldest children. We have four, and we are wondering what to do when the third expects the same rights. Smartphone management is something we continually get wrong. My hon. Friend has talked about screen time. It cannot be beyond the wit of smartphone makers to make parental controls more intuitive, so that they cannot be so easily undermined by the smart children using the smartphones. Does she agree that, while we need to strengthen the role of Ofcom in rooting out the toxic content that our children are pushed towards, smartphone manufacturers also have a job to do in empowering parents? It is a real concern, because children’s use of smartphones and their access to social media is a daily battle for their parents.
I fully support what my hon. Friend says. Lots of parents in Darlington have said that although the default setting may be that children cannot access chat rooms on games, or a more violent version of a game—because it is not just the phones and devices, but what children are accessing on those devices, that really matters—they simply lose the battle. When it comes to the crunch, with their child arguing to go on the device and threatening a tantrum, they just allow it. Parents need more support from us as legislators; that is basically my point.
Children should be able to enjoy games and access safe and engaging educational content. Platforms should not be allowed to target them with suggested content; that is where the problems come in, because through suggested content children are exposed to harmful and unhealthy things. Platforms should have child-safe search engines, and features such as live location and chat rooms should be designed to be transparent and child-friendly, with children’s safety at their heart. Accessing certain social media features, such as chatting with adults whom they do not know or sharing content, should be solely for those who have been strictly age-verified as over 16.
I thank the hon. Lady for her work with my constituent, Ellen Roome, on issues to do with children and social media. As a Liberal, I am instinctively against banning things. However, liberal society has long tolerated minimum age limits for things that might be dangerous for children, such as cigarettes, alcohol or driving. Does she agree that we should consider social media use in the same light?
That is absolutely right; I am grateful to the hon. Member for his intervention. It is important that we strike the right balance. For a long time, we have been behind on protecting children online. It is time now to use the Online Safety Act 2023 and the upcoming children’s codes to get it right. We do not know how they will bed in, so it is crucial that we get the first iterations of the children’s codes, due in April, absolutely right.
To be able to chat with strangers or have content suggested to them, a person should be age-verified as over 16. For me, the online world is a hugely valuable part of modern life. As with everything we do offline, we must ensure that it is safe and regulated for children to use, and if it is not, we should not let them use it.
In Darlington, I have set up an online safety forum with year 10s across every secondary school in the town. Their biggest concern is the disturbing content that the Online Safety Act and children’s codes should protect against, but they have also flagged to me awful, horrible examples of peer-to-peer bullying, which is totally unacceptable yet goes unchecked on social media platforms. Ofcom is required to issue new codes only every three years, so if the first codes do not get it right in April, a 13-year-old could be waiting three years to be protected properly, by which point they will be 16 anyway.
The age to use social media in its current form, where platforms can suggest content and children can chat unchecked with strangers, should clearly be 16. Those whose age is not verified should be able only to access child-safe, limited platforms designed for children. That is common sense. I am concerned that without further legislation, platforms will be left to implement their own safeguards. In some cases, those may well be good, but our job is not to leave the protection of children online to chance. We should stipulate an age, require ID and be bold leaders in this space. Our children will look back and ask us what we were waiting for.