Online Safety: Children and Young People Debate
Julie Minns (Labour - Carlisle), debate with the Department for Science, Innovation & Technology
Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister gives the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record.
It is a pleasure to speak under your chairmanship, Mr Dowd. Some 20 years ago, I started a new job with an as yet unbranded mobile network operator. At the time, the network had no masts, no handsets and no customers. Text messaging was just catching on, the BlackBerry was in its infancy and wireless application protocol was the new kid on the block. For those who do not know what WAP was, it was a bit like having Ceefax on a handset; for those who do not know what Ceefax was, I cannot really help.
My counterparts and I at the four mobile networks were acutely aware that the introduction of 3G would change how we used our phones. I will, however, confess that understanding what that change would look like—all while using dial-up at home—was something of a stab in the dark. Nevertheless, no matter how challenging, we knew that the advent of 3G required the mobile industry to take greater responsibility to protect the safety of our customers, in particular those under the age of 18. The networks moved from a walled-garden internet, where access was controlled by age verification and personal identification number, to a world where the internet was freely available.
The mobile networks published the first self-regulatory code of content on mobile. It was a world first, and something that UK mobile operators were rightly proud of, but the pace of change was rapid; within months, we networks published a further self-regulatory code to govern location-based services, which, as we have heard already, present a clear danger to young people. We knew then that location tracking could be used in grooming and other predatory behaviour. We published the code, but the pace of change over the past 20 years has been unrelenting, and we now arrive at a point at which almost everything we do happens online.
The role of the mobile network is no longer as a gatekeeper to services, but rather as a pipe to over-the-top services such as YouTube, WhatsApp and TikTok. Those services can be more readily controlled by both the service provider and the handset manufacturer. That is not to absolve the networks of responsibility, but to acknowledge that they operate in a mobile value chain. I might pay £25 a month to my mobile network, but if I renew my handset every two years at a cost of £800, I am paying far more to the handset manufacturer than to the mobile network operator. I believe there is a strong argument that those who derive the greatest financial value from that value chain bear far greater responsibility for keeping children and young people safe online than is currently the case.
I turn now to one specific aspect of online harm. Having worked closely with the Internet Watch Foundation during my time in industry, I am fully aware of—and I thank it for—its important work in assessing child sexual abuse imagery and removing it from the internet. I have visited and met the IWF teams who have to view and assess some of the most upsetting content. Their work is harrowing and distressing, but, sadly, it is essential.
Last year, the IWF assessed more than 390,000 reports and confirmed more than 275,000 web pages containing images or videos of children suffering sexual abuse. Each page contained hundreds, if not thousands, of indecent images of children. The IWF reported that 2023 was the most extreme year on record, with more category A sexual abuse imagery discovered than ever before, 92% of it self-generated child abuse. That means that the children have been targeted, groomed and coerced into sexual activities via webcams and devices with cameras.
For the first time, the IWF also encountered and analysed more than 2,400 images of sexual abuse involving children aged three to six. Some 91% of those images were of girls, mainly in domestic settings such as their own bedrooms or bathrooms. Each image or video is not just a single act; every time it is viewed or downloaded is another time that that child is sexually abused.
That is why I conclude my remarks with a clear ask to both the online and offline media and broadcast channels of our country: please stop describing these images as “kiddie porn” and “child pornography”. I did a search of some online news channels before I came to this debate; that language is still prevalent, and it has to stop. These images are not pornography. They are evidence of a crime and evidence of abuse. They are not pictures or videos. They are depictions of gross assault, sadism and bestiality against children. They are obscene images involving penetrative sexual activity with teenagers, children and babies. If there is one thing we can agree on in this debate, it is that the media in this country must start describing child sexual abuse material for what it is. Language matters, and it is time the seriousness of the offence was reflected in the language that describes it.
I am going to have to introduce a formal time limit of three and a half minutes.