Online Safety Act 2023: Repeal Debate
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
Lizzi Collinge (Morecambe and Lunesdale) (Lab)
It is a pleasure to serve under your chairship, Mr Pritchard. It was interesting to hear from my hon. Friend the Member for Sunderland Central (Lewis Atkinson) about the experience of smaller hobby sites and their concerns about the Online Safety Act. I am sure that Ofcom and the Government will listen to those.
The Online Safety Act is not about controlling speech or about the Government deciding what adults think or read or say online, but about responsibility. More specifically, it is about whether we are prepared to say that the online world should have the same safety features as the offline world—whether we expect the online world to be a wild west or not. A lot of the opposition to the Online Safety Act has centred on the freedoms of adults, which I appreciate are important. Adults must be free to build their online lives as they see fit. However, that does not trump the right of children to be safe, whether online or offline, and rights are always a matter of balance.
Before I go further, it is worth being clear about what the Act actually does. It requires online services to assess the risk of harm on their platforms and put proportionate systems in place to reduce those risks. That includes harm from illegal content, such as child sexual abuse material, and harm when children are able to access content such as pornography or material that promotes suicide or self-harm. Alongside that, the Act contains proactive requirements to protect freedom of expression, and the largest platforms are now legally required to continually assess how their decisions affect users’ ability to speak freely online. That obligation is explicit and enforceable.
In many ways, the principles behind the Act are not new. Technology companies have moderated speech and removed content from their platforms since the very beginning. The difference is that, until now, those decisions were driven by opaque corporate priorities, not by a clear and accountable framework for addressing public harm.
The stakes here are high. These are some of the first young people whose entire lives have been permeated by the online world. It shapes their values, relationships and mental health. For many children, when it comes to sex, self-harm or body image, the first place they turn is not a parent, a teacher or a GP; it is the internet.
I want to talk today about pornography. I think we all accept without controversy that children should not be able to access pornography offline—an adult entertainment shop does not let a 12-year-old walk in and buy a dirty video with their pocket money—but when it comes to internet pornography, we as a society have allowed children to freely access material that they are simply not mature enough to deal with. Pornography is more violent and more dangerous than ever before. Despite that, it has never been easier for children to access it. The door to the store has been wide open for too long.
According to a 2023 report by the Children’s Commissioner—before the Online Safety Act came into force—the vast majority of children surveyed said that they had seen pornography online by accident, through websites such as X, formerly known as Twitter. Kids did not even need to seek it out; it was being fed to them. When they did seek it out, dedicated sites did not put up any barriers. The previous requirements for websites such as Pornhub were simply for someone to enter a date of birth, which meant the sole access requirement was the ability to subtract 18 from the current year. I think we all know that is not good enough.
That matters because online pornography is not passive; it teaches. It shapes how children understand sex, intimacy, power and consent. It sets expectations long before young people have the tools to question or contextualise what they are seeing. According to that same report by the Children’s Commissioner, more than half of respondents said they had seen pornography involving strangulation, and 44% reported seeing depictions of rape, many of which involved people who were apparently asleep.
Such content does not stay onscreen; it spills into real life. The Children’s Commissioner’s research showed that frequent exposure to violent sexual material is associated with a higher tolerance of sexual aggression, distorted ideas about consent and an increased likelihood of sexually aggressive behaviour. Almost half of young girls surveyed expected sex to involve physical aggression. What children learn online does not disappear when the browser closes.
With the Online Safety Act, for the first time, adult content is being age-restricted online in the same way it is offline, and sites must now use effective age verification tools. That includes third-party services, which should use privacy-preserving techniques to confirm users’ ages without sharing personal information with the platform itself. Since the new law came into effect, Ofcom has been monitoring compliance, and many of the most visited pornography sites have introduced highly effective age checks. I will be honest: I really do not have a lot of sympathy for pornography users who object to having their age verified. If they are bothered about their privacy, they can just not use it. Pornography is not a human right; people can choose not to use it.
Pornography is not the only harm that the Act addresses: for years, platforms such as Twitter, Tumblr and TikTok have hosted vast amounts of content related to self-harm and suicide—some of it framed as support, but much not. Posts and forums provide copious instructions on how to self-harm: the implements to use, how best to hide it and where to cut to do the most damage without killing oneself. Some children accessed that content entirely by accident, before even knowing what self-harm is, while others found it when they were already struggling, and were pulled deeper into it by algorithms that reward repetition and intensity. That content not only risks normalising those behaviours; it risks glamorising them.
So many adults have no idea what is out there, and because they are not fed it on their own feeds, they do not understand the danger and the extremism. Investigations have shown that teenage accounts engaging with suicide, self-harm or depression content were then flooded with more of the same. A single click could trigger what one report from the Molly Rose Foundation described as
“a tsunami of harmful content”.
I am not saying that we should shut down places that offer support to young people who have urges to self-harm, but we need to make sure that young people can access evidence-based support and are not exposed to content that could encourage harm. That is why organisations such as Samaritans have praised the Online Safety Act.
Under the Act, platforms that recommend or promote content to users—for example, “For You” feeds on TikTok—must ensure that those systems do not push harmful content to children. Not only does that put the onus on platforms to prevent children from seeing such content, but it also means that, if children do come across or search for harmful content, platforms should avoid showing them more of the same, so they do not go down a very harmful rabbit hole.
Clearly, it is still early days. The legislation includes a formal review, with a report to Parliament due within a few years of full implementation. We will, and should, look closely at what is working and what needs to be improved—as lawmakers, we have that responsibility—but the signs are encouraging. Sky News spoke to six teenagers before and after the new rules came into force, and five of them said that they were seeing much less harmful content in their feeds. I know that is anecdata, but it is important to listen to the experiences of young people.
Ofcom has opened investigations, and benefits have already come from them. For example, following an Ofcom investigation, file-sharing services that were being used to distribute child sexual abuse material have now installed automated technology to detect and remove such material. Proportionality is at the heart of the Act, and Ofcom has developed guidance to support compliance. I understand the concerns about smaller or volunteer-run forums, but some of the most harmful content appears on very small or obscure sites, so simply taking smaller sites out of the Act’s scope would be a disservice.
I am sure there will be problems that must be worked out. We should continue to explore how best to provide children with age-appropriate experiences online, and think about how to get age verification right. But while we refine and improve the system, we cannot ignore the reality that there have been serious harms and that we have a responsibility to tackle them. For the first time, the UK has a regulatory framework that forces tech companies to assess risk, protect freedom of expression and give the public far greater transparency on how decisions about online content are made.
Other countries have banned young people from social media. I have been thinking about that a lot, and I currently do not think it is the right thing to do. Online communities can provide friendship and solace to young people—particularly those who are marginalised, perhaps due to their sexual orientation, or who are restricted in life, perhaps because they are kept at home by ill health or disabilities. Online communities can offer a lot to our young children, but children have a right to be just that: children. They should not have to deal with the complexities and hardships of adult life, so we as adults must do what we can to build safe online spaces for them, just as we build safe physical spaces.
Emily Darlington
I completely agree and I am going to come to that.
I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes when someone uploads something to a site with end-to-end encryption. For those who do not know, we use such sites all the time—our WhatsApp groups and Facebook Messenger use end-to-end encryption. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads something and it becomes encrypted, their image or message is screened. It is screened for bugs to ensure that they are not sharing viruses, but equally it could be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images being collected and shared with other paedophiles.
My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.
Lizzi Collinge
My hon. Friend has been talking about the dangers that children are exposed to. Does she believe that parents are equipped to talk to their children about these dangers? Is there more we can do to support parents to have frank conversations with their children about the risks of sharing images and talking to people online?
Emily Darlington
I completely agree. As parents, we all want to be able to have those conversations, but because of the way the algorithms work, we do not see what they see. We say, “Yes, you can download this game, because it has a 4+ rating.” Who knows what a 4+ rating actually means? It has nothing to do with the BBFC ratings that we all grew up with and understand really well. Somebody else has decided what is all right and made up the 4+ rating.
For example, Roblox looks as if it is child-ready, but many people might not understand that it is a platform on which anyone can develop a game. Those games can involve the grooming of children and sexual violence; they are not all about the silly dances that children do in the schoolyard. That platform is inhabited by adults just as much as by children.