Online Safety Act 2023: Repeal

Monday 15th December 2025

Westminster Hall

Tom Collins (Worcester) (Lab)

It is a pleasure to serve under your chairship, Mr Pritchard.

At its birth, the internet was envisaged as a great advancement in a free society: decentralised, crowdsourced and open, it would share knowledge across humanity. As it grew, every one of us would own a platform and our voice. Of course, since then bandwidth has increased massively, which means that we now experience a rich variety of media. Storage and compute have increased by many orders of magnitude, which has created the power of big data, and generative capabilities have emerged quite recently, creating a whole new virtual world. Services no longer simply route us to what we were searching for but offer us personalised menus of rich media, some from human sources and some generated to entertain or meet demands.

We are now just starting to recognise the alarming trends that we are discussing today. Such rich media and content have become increasingly harmful. That compute, storage and big data power is being used to collect, predict and influence our most private values, preferences and behaviours. Generative AI is immersing us in a world of reconstituted news, custom facts and bots posing as people. It increasingly feels like a platform now owns every one of us and our voice.

Harms are dangerously impacting our young people. Research from the Centre for Countering Digital Hate illustrates some of the problems. On YouTube, the “Next Video” algorithm was found to be recommending eating disorder content to the account of a UK-based 13-year-old female. In just a few minutes, the account was exposed to material promoting anorexia and weight loss, and more than half the other recommended videos were for content on eating disorders or weight loss.

On TikTok, new teen accounts were found to have been recommended self-harm and eating disorder content within minutes of scrolling the “For You” feed. Suicide content appeared within two and a half minutes, and eating disorder content within eight. Accounts created with phrases such as “lose weight” received three times as many of these videos as standard teen accounts, and 12 times as many self-harm videos. Those are not isolated incidents, and they show the scale and speed at which harmful material can spiral into exponential immersion in worlds of danger for young people.

On X, formerly known as Twitter—and I give a trigger warning for anybody who has been affected by the absolutely appalling Bondi Beach Hanukkah attack—following the Manchester synagogue attack, violent antisemitic messages celebrating it and calling for further violence were posted and left live for at least a week. ChatGPT has been shown to produce dangerous advice within minutes of account creation, including guidance on self-harm, restrictive diets and substance misuse.

I am grateful to hon. Friends for raising the topic of pornography. I had the immense privilege of being at an event with a room full of men who spoke openly and vulnerably about their experiences with pornography: how it affected their sex lives, their intimacy with their partners or wives, their dynamics of power and respect, and how it infused all their relationships in daily life. They said things such as, “We want to see it, but we don’t want to want to see it.” If adult men—it seems from this experience, at least, perhaps the majority of adult men—are finding it that hard to deal with, how can we begin to comprehend the impact it is having on our children who come across it accidentally?

This can all feel too big to deal with—too big to tackle. It feels immense and almost impossible to comprehend and address. Yet, to some, the Online Safety Act feels like a sledgehammer cracking a nut. I would say it is a sledgehammer cracking a deeply poisonous pill in a veritable chemistry lab of other psychoactive substances that the sledgehammer completely misses and will always be too slow and inaccurate to hit. We must keep it, but we must do better.

As an engineer, I am very aware that since the industrial revolution, when physical machines suddenly became immensely more powerful and complex, a whole world of not just regulations but technical standards has been built. It infuses our daily lives, and we can barely touch an object in this room that has not been built and verified to some sort of standard—a British, European or global ISO standard—for safety. We should be ready to reflect that model in the digital world. A product can be safe or unsafe. We can validate it to be safe, design it to be safe, and set criteria that let us prove it—we have shown that in our physical world since the industrial revolution. So how do we now begin to put away the big, blunt instrument of regulation when the problem seems so big and insurmountable?

John Slinger (Rugby) (Lab)

Ofcom officials came before the Speaker’s Conference, of which I am a member, so I declare that interest. They spoke about section 100 of the Act, which gives Ofcom the power to request certain types of information on how, for example, the companies’ recommender algorithms work. Unfortunately, they said that could be “complicated and challenging to do”, but one thing they spoke about very convincingly was that they want to require—in fact, they can require—those companies to put information, particularly about the algorithms, in the public domain to help researchers. That could really help with the point my hon. Friend is making about creating regulations that improve safety for our population.

--- Later in debate ---
Emily Darlington (Milton Keynes Central) (Lab)

It is a pleasure to serve under your chairmanship, Mr Pritchard. I want to add some actual data to our debate today. We are inundated, often online or in our inboxes, with messages about repealing the Online Safety Act. These are well-funded campaigns. There is also a lot of material online coming from very particular sources, not necessarily within the UK. Actually, 70% of people in the UK support the Online Safety Act and a similar number support age verification. Much of that has to do with what our children are seeing online. Almost 80% of people aged 18 to 21 have seen sexual violence before age 18. That is a huge number of people whose initial sexual experiences or viewing of sex involves violence.

What does the Online Safety Act do? It puts porn back on the top shelf—it does not get rid of it. We are all of an age to remember when porn was on the top of the magazine rack in the corner shop. Now it is being fed to our children in their feeds. The issue is also the type and nature of porn that people are seeing online: 80% of online porn features some kind of strangulation. That has real-world consequences, as we have seen from the latest data on women’s health: strangulation is now the second leading cause of strokes among women in the UK. That is shocking, and it is why we needed the Online Safety Act to intervene on what was being fed to us.

In Milton Keynes, 30% of young people have been approached by strangers online since the implementation of the Online Safety Act. They are most frequently approached on Roblox. We do not automatically identify gaming platforms as places where people are approached by strangers, but we know from police investigations that groomers approach young children on Roblox and move them to end-to-end encrypted sites, where they can ask them to share images.

In 2024, there were 7,263 online grooming offences—remember, those are just the ones not on end-to-end encrypted sites. There were 291,273 reports of child sexual abuse material identified last year—again, that does not include material shared on end-to-end encrypted sites, because we have no idea what is actually being shared on those. Some 90% of that material is self-generated—that is, groomers asking children to take pornographic pictures of themselves and share them. Once a picture is shared with a groomer, it goes into networks and can be shared anywhere in the UK or the world. The UK is the biggest consumer of child sexual abuse images; the police reckon that 850,000 people in the UK are consuming them.

John Slinger

I thank my hon. Friend for making an impassioned and powerful speech. Does she agree that outrage ought to be directed at us for not doing enough on these issues rather than for the way in which we have started to try to tackle them?

If the behaviours that my hon. Friend and other hon. Members have referred to happened in the real world—the so-called offline world—they would be clamped down on immediately and people would be arrested. Certain items cannot be published, put in newsagents or smuggled into school libraries, and people could not get away with the defence, “This is a matter of my civil liberty.” We should be far more robust with online companies for the frankly shoddy way in which they are carrying out their activities, which is endangering our children and doing immense damage to our political system and to wider life in our country and beyond.

Emily Darlington

I completely agree and I am going to come to that.

I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes for when someone uploads something to an end-to-end encrypted site. For those who do not know, we use such sites all the time—our WhatsApp groups and Facebook Messenger are end-to-end encrypted. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads an image or message and it becomes encrypted, it is screened. It is screened for bugs, to ensure that viruses are not being shared, but it could equally be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images being collected and shared with other paedophiles.
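A minimal sketch of the kind of pre-encryption screening described above, in which content is checked against a list of hashes of known abuse material before the client encrypts and sends it, might look like the following. The names and hash values are illustrative only, not any platform's actual API; real deployments use perceptual hashing (for example PhotoDNA) rather than plain cryptographic hashes, so that resized or re-encoded copies are still caught.

```python
import hashlib

# Hypothetical list of hashes of known child sexual abuse images, of the
# kind maintained in practice by bodies such as the Internet Watch Foundation.
# Placeholder values only; real systems use perceptual hashes, not SHA-256.
KNOWN_ABUSE_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def is_safe_to_send(content: bytes) -> bool:
    """Screen content on the sender's device, before end-to-end encryption.

    This runs in the same place existing malware scanning already runs:
    after the user selects a file, before the client encrypts and uploads it.
    """
    digest = hashlib.sha256(content).hexdigest()
    return digest not in KNOWN_ABUSE_HASHES

def send_image(content: bytes) -> None:
    # Block the upload if the content matches known abuse material;
    # a real client would also report the match to the platform.
    if not is_safe_to_send(content):
        raise PermissionError("Upload blocked: matches known abuse imagery")
    # ... encrypt with the recipient's key and transmit as normal ...
```

The design point in the speech is that the check happens client-side, before encryption, so the end-to-end encryption of the message in transit is left untouched.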

My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.

--- Later in debate ---
Emily Darlington

My hon. Friend raises two really important points. First, if we try to create legislation to address what companies do today, it will be out of date by the time it passes through the two Houses. What we do must be done on the basis of principles, and a very good starting principle is that what is illegal offline should be illegal online. That is a pretty clear principle. Offline legislation has been robustly challenged over hundreds of years and has got us to where we are with our freedom of speech, freedom of expression and freedom to congregate. All those things have been rigorously tested by both Houses.

John Slinger

On that critical point about the lack of equality between offline and online, does my hon. Friend agree that if I went out into the street and stapled to somebody’s back an offensive but not illegal statement that could not be washed off and remained there for months, if not years, I would probably be subject to immediate arrest, yet online that happens routinely to our children—indeed, to anyone in society, including politicians? Is that not illustrative of the problem?

Emily Darlington

I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.

My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.

The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point: the distinction between small but non-harmful sites and small but harmful sites is poorly defined, and it is really important that the Act provides some clarity on that.

We do not have enough protections for democracy. The Science, Innovation and Technology Committee, which I am a member of, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it went from kind of “meh” to really awful. The platforms say, “We don’t police our content. We’re just a platform.” That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.

The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows the rise of misogyny leading to the rise of every other form of extremism and how that links back to the online world. If this was all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether into violence against women or further into antisemitism, anti-Islam, anti-anybody who is not the same colour, or anti-anybody who is perceived not to be English from Norman times.

The algorithms promote violent and shocking content, but they also shadow-ban really important content, such as information on women’s health. Platforms are happy to shadow-ban terms such as “endometriosis” and “tampon”—and God forbid that a tampon commercial should feature red liquid rather than blue liquid. That content gets shadow-banned and is regularly taken down and taken out of the algorithms, yet the platforms say they can do nothing about people threatening rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution: 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could deliver that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee’s No. 1 recommendation.

[Sir John Hayes in the Chair]

Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.