Online Safety Act 2023: Repeal

Debate between John Slinger and Emily Darlington
Monday 15th December 2025

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Emily Darlington (Milton Keynes Central) (Lab)

It is a pleasure to serve under your chairmanship, Mr Pritchard. I want to add some actual data to our debate today. We are inundated, often online or in our inboxes, with messages about repealing the Online Safety Act. These are well-funded campaigns. There is also a lot of material online coming from very particular sources, not necessarily within the UK. Actually, 70% of people in the UK support the Online Safety Act and a similar number support age verification. Much of that has to do with what our children are seeing online. Almost 80% of people aged 18 to 21 have seen sexual violence before age 18. That is a huge number of people whose initial sexual experiences or viewing of sex involves violence.

What does the Online Safety Act do? It puts porn back on the top shelf—it does not get rid of it. We are all of an age to remember when porn was on the top of the magazine rack in the corner shop. Now it is being fed to our children in their feeds. The issue is also the type and nature of porn that people are seeing online: 80% of online porn includes some kind of strangulation. That has real-world consequences, as we have seen from the latest data on women’s health: strangulation is now the second leading cause of stroke among women in the UK. That is shocking, and it is why we needed the Online Safety Act to intervene in what was being fed to us.

In Milton Keynes, 30% of young people have been approached by strangers since the implementation of the Online Safety Act. They are most frequently approached on Roblox. We do not automatically identify gaming platforms as places where people are approached by strangers, but we know from police investigations that groomers approach young children on Roblox and move them to end-to-end encrypted sites, where they can ask them to share images.

In 2024, there were 7,263 online grooming offences—remember that those will just be the ones that are not in end-to-end encryption sites. There were 291,273 reports of child sexual abuse material identified last year—again, remember, that is not the material being shared on end-to-end encryption sites, because we have no idea what is actually being shared on those. Some 90% of that material is self-generated—that is, groomers asking children to take pornographic pictures of themselves and share them. Once a picture is shared with a groomer, it goes into networks and can get shared anywhere in the UK or the world. The UK is the biggest consumer of child sexual abuse images. The police reckon that 850,000 people in the UK are consuming child sexual abuse images.

John Slinger

I thank my hon. Friend for making an impassioned and powerful speech. Does she agree that outrage ought to be directed at us for not doing enough on these issues rather than for the way in which we have started to try to tackle them?

If the behaviours that my hon. Friend and other hon. Members have referred to happened in the real world—the so-called offline world—they would be clamped down on immediately and people would be arrested. Certain items cannot be published, put in newsagents or smuggled into school libraries, and people could not get away with the defence, “This is a matter of my civil liberty.” We should be far more robust with online companies for the frankly shoddy way in which they are carrying out their activities, which is endangering our children and doing immense damage to our political system and wider life in our country and beyond.

Emily Darlington

I completely agree and I am going to come to that.

I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes when someone uploads something to a site with end-to-end encryption. For those who do not know, we use such sites all the time—our WhatsApp groups and Facebook Messenger are end-to-end encrypted. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads something and it becomes encrypted, their image or message is screened. It is screened for bugs to ensure that they are not sharing viruses, but it could equally be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images being collected and shared with other paedophiles.

My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.

--- Later in debate ---
Emily Darlington

My hon. Friend raises two really important points. First, if we try to create legislation to address what companies do today, it will be out of date by the time that it passes through the two Houses. What we do must be done on the basis of principles, and I think a very good starting principle is that what is illegal offline should be illegal online. That is a pretty clear principle. Offline legislation has been robustly challenged over hundreds of years and got us to where we are with our freedom of speech, freedom of expression and freedom to congregate. All those things have been robustly tested by both Houses.

John Slinger

On that critical point about the lack of equality between offline and online, does my hon. Friend agree that if I were to go out into the street and staple to somebody’s back an offensive but not illegal statement that could not be washed off and remained on their back for months, if not years, I would probably be subject to immediate arrest, yet online that happens routinely to our children—indeed, to anyone in society, including politicians? Is that not illustrative of the problem?

Emily Darlington

I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.

My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.

The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point: the distinction between small but non-harmful sites and small but harmful sites is poorly defined, and it is really important that the Act provides clarity on that.

We do not have enough protections for democracy. The Science, Innovation and Technology Committee, of which I am a member, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it ranged from a kind of “meh” to really awful. The platforms say, “We don’t police our content. We’re just a platform.” That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.

The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows how the rise of misogyny leads to the rise of every other form of extremism and how that links back to the online world. If this were all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether violence against women or, further along, antisemitism, anti-Islam, hostility towards anybody who is not the same colour, or towards anybody who is perceived not to have been English since Norman times.

The algorithms promote violent and shocking content, but they also shadow-ban really important content, such as information on women’s health. Platforms are happy to shadow-ban terms such as “endometriosis” and “tampon”—and God forbid that a tampon commercial should feature red liquid rather than blue. That content gets shadow-banned and is regularly taken down and removed from the algorithms, yet the platforms say they can do nothing about people threatening to rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution: 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could do that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee’s No. 1 recommendation.

[Sir John Hayes in the Chair]

Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.