Online Safety: Children and Young People
I thank my hon. Friend for allowing me to intervene again. In my previous role as head of public policy at the British Computer Society, the one thing that my colleagues and I talked about a lot was the lack of focus on education in the Online Safety Act. I commend the previous Government for passing that legislation, which was very brave. The Act has tried to do some wonderful things, but what is missing is that we have failed to empower a generation of young people to act safely online, to be able to take back the power and say, “No, I am not going to do that.” We have failed in that so far. How do we build that in for the future?
Order. I would like to bring to the attention of Members that we have had a huge number of interventions and we are 20 minutes into the debate. The Minister and Opposition spokesperson will get up at just after half past 3. It is a matter for the speaker whether she takes more interventions, but that does mean that the amount of time for those who have asked to speak will be significantly more restricted than I originally planned. That is just a housekeeping matter to be aware of. There is also an issue about the length of interventions: they are getting a bit long. On a matter of this importance, I do not want to restrict interventions and contributions, but I ask Members to please bear that in mind.
Okay, I will make progress. On the live location element, which I have discussed, I am not sure that there is any advantage in children using that, unless it is a specifically regulated live location app where the parents have given consent for their child.
I do not know whether chatting to strangers on games is suitable for children. Adding peers to a group and enjoying playing with them on games is fine, but there could be strangers from other countries, with no indication of their age. One child told me that he had found out, after about three weeks, that the person he had been playing with was a 50-year-old man on another continent. That man was probably mortified, as was the child, and they stopped playing together. Why are we leaving it up to them? That is such a high-risk strategy for those apps; we need to think about that.
It is down to Parliament to decide what is safe for our children, and to enforce it. Asking platforms to mark their own homework and police themselves will undoubtedly lead to more children seeing inappropriate, harmful content and sharing it with others. I would like the Government to strengthen the children’s codes and to consider shifting the onus: rather than reactive safety measures that make apps safe for users we suspect are children, we should proactively make apps and platforms safe for all children in the first place, and create adult-only apps that require strong age verification, because adults can consent to giving their data.
A number of ways to protect children online are being debated, as I am sure we will hear this afternoon. I feel strongly that retrofitting apps once children have been exposed to harmful content or strangers, or have shared things they should not, is not the safest or most effective way to do this. A number of options around age verification are on the table, but I would like the Government to consider that being a child is tough and that children have a right to make mistakes. The issue is that those mistakes involve mass communications to peers and a permanent digital footprint, because someone has consented, aged 13, to give away their data.
We need to ask whether any child can truly consent to giving away their data, and therefore whether apps that identify their audience as children should be allowed to keep data at all. Should children be in chatrooms with strangers across the world? Should children be allowed to share their live location with strangers or people they have accepted as contacts? Should children be allowed to view unregulated livestreams or addictive-by-design content? Those questions have been raised not only by children themselves but by parents, national advocacy charities and leaders in this space. There is a consensus that we have to take action on this issue, so let us make the most of it.
Order. I remind Members that they should bob if they wish to be called in the debate.
I could talk for hours on this subject, Mr Dowd, but, do not worry, I will not. There are a number of things that I would like to say. Not many Members present sat through the majority of the Online Safety Bill Committee as it went through Parliament, but I was in every one of those meetings, listening to various views and debating online safety.
I will touch on one issue that the hon. Member for Darlington (Lola McEvoy) raised in her excellent and important speech. I agree with almost everything she said. Not many people in Parliament have her level of passion or knowledge about the subject, so I appreciate her bringing forward the debate.
On the issue of features, I totally agree with the hon. Member, and I moved an amendment to that effect during the Bill’s progress. There should be restrictions on the features that children are able to access. She was talking about safety by design, so that children do not have to see content that they cannot unsee, do not have to experience things that they cannot un-experience, cannot be contacted by people they do not know, and cannot livestream. We have seen an increase in the amount of self-generated child sexual abuse material, and livestreaming is a massive proportion of that.
Yesterday, a local organisation in Aberdeen called CyberSafe Scotland launched a report on its work in 10 of our primary schools with 1,300 children aged between 10 and 12—primary school children, not secondary school children. Some 300 of those children wrote what is called a “name it”, where they named a problem that they had seen online. Last night, we were able to read some of the issues that they had raised. Pervasive misogyny is everywhere online, and it is normalised. It is not just in some of the videos that they see and it is not just about the Andrew Tates of this world—it is absolutely everywhere. A couple of years ago there was a trend in online videos of young men asking girls to behave like slaves, and that was all over the place.
Children are seeing a different online world from the one that we experience because they have different algorithms and have different things pushed at them. They are playing Roblox and Fortnite, but most of us are not playing those games. I am still concerned that the Online Safety Act does not adequately cover all of the online gaming world, which is where children are spending a significant proportion of their time online.
A huge amount more needs to be done to ensure that children are safe online. Not enough is in place for reviewing the online safety legislation, something that Members on both sides of the House pushed for to ensure that the legislation is kept as up to date as possible. The online world changes very rapidly: the scams that were happening nine months ago are totally different from those happening today. I am still concerned that the Act focuses too much on the regulation of Facebook, for example, rather than the regulation of the online world that our children actually experience. CyberSafe Scotland intentionally centred the views and rights of young people in its work, which meant that the programmes it delivered in schools were much more appropriate and children were much better able to listen and react to them.
The last thing that I will mention is Girlguiding and its annual Girls’ Attitudes Survey, which shows a huge increase in the number of girls who feel unsafe because of the online world they are experiencing. We have a huge amount of responsibility here, and I appreciate the hon. Member for Darlington bringing the debate forward today.
I will keep this to an informal four-minute limit. Regrettably, if Members speak beyond that, I will have to introduce a formal figure.
It is a pleasure to speak under your chairmanship, Mr Dowd. Some 20 years ago, I started a new job with an as yet unbranded mobile network operator. At the time, the network had no masts, no handsets and no customers. Text messaging was just catching on, the BlackBerry was in its infancy and wireless application protocol was the new kid on the block. For those who do not know what WAP was, it was a bit like having Ceefax on a handset; for those who do not know what Ceefax was, I cannot really help.
My counterparts and I at the four mobile networks were acutely aware that the introduction of 3G would change how we used our phones. I will, however, confess that understanding what that change would look like—all while using dial-up at home—was something of a stab in the dark. Nevertheless, no matter how challenging, we knew that the advent of 3G required the mobile industry to take greater responsibility for the safety of our customers, in particular those under the age of 18. The networks moved from a walled-garden internet, where access was controlled by age verification and a personal identification number, to a world where the internet was freely available.
The mobile networks published the first self-regulatory code governing content on mobile. It was a world first, and something that UK mobile operators were rightly proud of, but the pace of change was rapid; within months, we networks published a further self-regulatory code to govern location-based services, which, as we have heard already, present a clear danger to young people. We knew then that location tracking could be used in grooming and other predatory behaviour. We published the code, but the pace of change over the past 20 years has been unrelenting, and we now arrive at a point at which almost everything we do happens online.
The role of the mobile network is no longer that of a gatekeeper to services, but rather a pipe to over-the-top services such as YouTube, WhatsApp and TikTok. Those services can be more readily controlled by both the service provider and the handset manufacturer. That is not to absolve the networks of responsibility, but to acknowledge that they operate in a mobile value chain. I might pay £25 a month to my mobile network, but if I renew my handset every two years at a cost of £800, I am paying more to the handset manufacturer than the £600 I pay the network operator over the same period. I believe there is a strong argument that those who derive the greatest financial value from that value chain should bear far greater responsibility for keeping children and young people safe online than is currently the case.
I turn now to one specific aspect of online harm. Having worked closely with the Internet Watch Foundation during my time in industry, I am fully aware of—and I thank it for—its important work in assessing child sexual abuse material and removing it from the internet. I have visited and met the IWF teams who have to view and assess some of the most upsetting content. Their work is harrowing and distressing but, sadly, it is essential.
Last year, the IWF assessed more than 390,000 reports and confirmed more than 275,000 web pages containing images or videos of children suffering sexual abuse. Each page contained hundreds, if not thousands, of indecent images of children. The IWF reported that 2023 was the most extreme year on record, with more category A sexual abuse imagery discovered than ever before; 92% of it was self-generated child sexual abuse material. That means that those children have been targeted, groomed and coerced into sexual activities via webcams and devices with cameras.
For the first time, the IWF also encountered and analysed more than 2,400 images of sexual abuse involving children aged three to six. Some 91% of those images were of girls, mainly in domestic settings such as their own bedrooms or bathrooms. Each image or video is not just a single act; every time it is viewed or downloaded, that child is sexually abused again.
That is why I conclude my remarks with a clear ask to both the online and offline media and broadcast channels of our country: please stop describing these images as “kiddie porn” and “child pornography”. I did a search of some online news channels before I came to this debate; that language is still prevalent, and it has to stop. These images are not pornography. They are evidence of a crime and evidence of abuse. They are not pictures or videos. They are depictions of gross assault, sadism and bestiality against children. They are obscene images involving penetrative sexual activity with teenagers, children and babies. If there is one thing we can agree on in this debate, it is that the media in this country must start describing child sexual abuse material for what it is. Language matters, and it is time the seriousness of the offence was reflected in the language that describes it.
I am going to have to introduce a formal time limit of three and a half minutes.
It is a pleasure to speak under your chairmanship, Mr Dowd. I congratulate the hon. Member for Darlington (Lola McEvoy) on bringing forward this important debate. The internet has undeniably been a valuable resource for learning and has transformed society, but technology has also brought with it significant risks that I believe we in this House have an urgent duty to address. Nobody knows that more acutely than all those parents who have tragically lost their children after online abuse, and who are bravely represented here today in the Public Gallery by Ellen.
The statistics are sobering. Recent figures from Ofcom reveal that one in five children in the UK has experienced some form of online harm, including cyber-bullying, exposure to inappropriate content and exploitation. The NSPCC reports that more than 60% of young people have encountered online bullying, but I think the risk goes much further than that. We know that the average age at which a child first views pornography is estimated to be 12, with some evidence now suggesting it is as young as eight years old. Free and widely available pornography is often violent, degrading and extreme, and it has become the de facto sex education for young people.
The pornography crisis is radically undermining the healthy development of children and young people, and contributing to increasing levels of sexual inequality, dysfunction and violence. That reality shows how children’s lives are affected by those dangers, and as parliamentarians we have a duty to keep our children safe and free from harm—online as well as offline. Nine in 10 children now have a mobile phone by the age of 11, and around a quarter of three-year-olds now have their own smartphone. I do not know about you, Mr Dowd, but I find that statistic particularly troubling.
I believe it is crucial to differentiate smartphone use from the broader digital environment. Smartphones, as we know, are engineered to be addictive, with notifications that stimulate the release of dopamine, the same chemical that is linked to pleasure. It is too easy for children to become trapped in a cycle of dependency and peer pressure, addicted to feeds and competing for likes on social media. Addiction is exactly what the tech companies want. Research from the Royal Society for Public Health shows that social media harms mental health—we all know that—particularly among young users. Around 70% of young people now report that social media increases their feelings of anxiety and depression.
The Children’s Commissioner, Rachel de Souza, believes that Ofcom’s children’s codes, which the hon. Member for Darlington talked about, are not strong enough and are written for the tech companies rather than for the children. She says that we need a code that protects our children from the “wild west” of social media. In South Devon I often hear from parents overwhelmed by the digital environment their children are navigating. They want to protect their children, but they feel ill equipped to manage those complexities. Hundreds of them have signed up to the smartphone-free pledge, and are pressuring schools to take part as well. We need to give them support, by backing what they want to do with legislation.
I believe we need a legislative framework that will restrict the addictive nature of smartphones, tighten age restrictions and restrict access to social media platforms for all children under 16. We have to protect them. Those measures are crucial for online child safety, and I believe there is a broad consensus in the House that big tech must be held accountable for the harm it perpetuates. We must abide—
It is an honour to serve under your chairmanship, Mr Dowd. It has also been a real honour to be part of this debate, and I have been scribbling away because so much genuine passion has been put into it. Do I have 10 minutes, Mr Dowd?
My cogs are turning—everyone in this debate wants to make a difference, and the time is now. That is the critical point. There is far too much illegal and harmful activity on social media and online, whether that is racist abuse, incitement to violence or the grooming of children—so much has been brought up.
Keeping children safe online is more difficult, but more important, than ever before. Several Members have mentioned that they spoke to their local parent groups and schools. I met children from The Grove school in Harpenden. One child said, “How old do you think I should be to have a smartphone?” And I said, “Well, how old would you like it to be?” He said, “Eleven.” I said, “Why?” He said, “Because that is when my brother got his.” It was really interesting that the teachers said, “We are discussing this as a school right now because the kids are asking those questions.” What also came through was the importance of listening to young people, because they are the ones who are crying out for change and saying that something is not right.
We have heard from many Members, including the hon. Member for Darlington (Lola McEvoy), who set up the debate in a way that none of us could follow, speaking with passion about the people behind this—the parents and the families. That is what we are all here for. We heard from the hon. Member for Rother Valley (Jake Richards) about how covid exacerbated problems, which highlighted the importance of discussing this issue now. The hon. Member for Gosport (Dame Caroline Dinenage) talked about Ian Russell and Molly; I think most of us are aware of that story. Ian has come to Parliament many times to talk about the impact, and we must never forget his family and so many more behind them. The hon. Member for Whitehaven and Workington (Josh MacAlister) spoke of the parallels between this issue and road safety, reminding us that we have to act now because, if we do not, we will look back and realise that we were doing a disservice to so many. We have to keep up on safety.
So much of this debate has been about identifying the issues with online safety, such as what the algorithms are sending us, location and chat features, the content and so much more. The hon. Member for Aberdeen North (Kirsty Blackman) talked about self-generated explicit content and the pervasive misogyny that so many have mentioned. The hon. Member for Carlisle (Ms Minns) reminded us that these images are evidence of a crime, not “pornography”, and that we need to get the language right. That is key. Sexual inequality and violence are pervasive because of that content.
The hon. Member for Whitehaven and Workington spoke about the addictiveness of phones, and the hon. Member for Lowestoft (Jess Asato) highlighted the fact that mobile phone use is contributing to short-sightedness. The hon. Member for Whitehaven and Workington mentioned sleep and asked what we are doing about the 21 hours a week spent on phones. So much of this is about what I call “digital mental health”: what is happening as a whole, beyond the algorithm and the impact of the content. The hon. Member for Strangford (Jim Shannon) mentioned self-harm, and I will certainly keep in mind the term “generational rewiring”, which the hon. Member for Whitehaven and Workington used.
When it comes to legislation, we have not acted fast enough and we have not gone far enough. As has been said, we need to move towards safety by design, but we also need legislation that is reactive and agile enough to keep up with the pace of change. As Liberal Democrats, we were proud to push for the Online Safety Act to go further, and we successfully called for it to include online fraud and scams, as well as to outlaw cyber-flashing.
The hon. Member for Aberdeen North talked about online games, and the fact that we need to stay up to date. The hon. Member for Gosport mentioned holding Ofcom to account. The hon. Member for Stafford (Leigh Ingham) talked about grooming laws, and how we need blunt and sharp elements in the instruments that we use. The right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) reminded us that behind all this, we must get the technicalities right in the Online Safety Act, highlighting that this is not just about the content, but about keeping up with the speed and agility of the change.
As a Liberal Democrat, I would like to highlight what we are calling for. The importance of being proactive has been mentioned many times, and that means calling for safety by design. We are also calling for an independent advocacy body for children’s safety online. We would like to create a new online crime agency to effectively tackle illegal content and online activity, such as revenge porn, threats and incitement to violence on social media. We would also like to include a digital Bill of Rights to protect everyone’s rights online. That includes balancing the rights to privacy, freedom of expression and participation. The regulation of social media must respect the rights and privacy of those who use it legally and responsibly, but should not have a laissez-faire approach.
Another important element is education. The hon. Member for Darlington said that we cannot tackle all of this content; we cannot get all of this right. It is therefore important that we also empower young people and parents to say what is right and wrong, and help them feel able to make a change, whether that is by using tools or by speaking up and saying, “Actually, this is not right.” We should make sure that they feel they have that voice.
My hon. Friend the Member for South Devon (Caroline Voaden) mentioned that big tech needs to be held accountable—absolutely. We have to make sure that those who are building the platforms are the ones who ensure their safety by design, and that they keep up with that.
I close with a reminder that keeping young people safe online is more difficult, but more important, than ever before. We must act sooner rather than later and use all the tools at our disposal, whether that is through Ofcom and regulatory changes, by incentivising companies or by educating parents and children. Frankly, from the debate I have heard today, I have hope that if we work together, we can make sure that those changes are enacted swiftly and are kept up to date.
I am short of time, so I will have to proceed.
For example, Ofcom recently announced plans to launch a further consultation on the illegal content duties once the first iteration of those duties is set out in spring next year. That iterative approach enables Ofcom to prioritise getting its initial codes in place as soon as possible while it builds on the foundations set out in that first set of codes.
My hon. Friends the Members for Slough (Mr Dhesi) and for Lowestoft (Jess Asato) and the hon. Member for Aberdeen North raised the issue of violence against women and girls. In line with our safer streets mission, platforms will have new duties to create safer spaces for women and girls. It is a priority of the Online Safety Act for platforms proactively to tackle the most harmful illegal content, which includes offences such as harassment, sexual exploitation, extreme pornography, intimate image abuse, stalking and controlling or coercive behaviour, much of which disproportionately affects women and girls. All services in scope of the Act need to understand the risks facing women and girls from illegal content online and take action to mitigate them.
My hon. Friend the Member for Carlisle (Ms Minns) set out powerfully the issues around child sexual exploitation and abuse. Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims. UK law is crystal clear: the creation, possession and distribution of child sexual abuse images are illegal. The strongest protections in the Online Safety Act are against child sexual exploitation and abuse. Ofcom will have strong powers to direct online platforms, messaging services and search services to combat that kind of abuse. It will be able to require platforms to use accredited, proactive technology to tackle CSEA, and it will have powers to hold senior managers criminally liable if they fail to protect children.
I am running short of time, so I shall make some final remarks. While we remain resolute in our commitment to implementing the Online Safety Act as quickly and effectively as possible, we recognise the importance of these ongoing conversations, and I am grateful to everyone who has contributed to today’s debate. I am grateful to the brave parents who continue to fight for protections for children online and shine a light on these important issues. The Opposition spokesperson, the hon. Member for Runnymede and Weybridge (Dr Spencer), asked a host of questions. I will respond to him in writing, because I do not have time to do so today, and I will place a copy in the Library.