Westminster Hall

Monday 15th December 2025

(1 day, 21 hours ago)

Westminster Hall
Read Hansard Text

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the offical record

Monday 15 December 2025
[Mark Pritchard in the Chair]

Online Safety Act 2023: Repeal

Monday 15th December 2025

(1 day, 21 hours ago)

Westminster Hall
Read Hansard Text Read Debate Ministerial Extracts

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the offical record

16:30
Lewis Atkinson Portrait Lewis Atkinson (Sunderland Central) (Lab)
- Hansard - - - Excerpts

I beg to move,

That this House has considered e-petition 722903 relating to the Online Safety Act.

It is a pleasure to serve with you in the Chair, Mr Pritchard, and to open this important debate as a member of the Petitions Committee. I start by thanking the 550,138 people who signed the petition for their engagement with the democratic process, and in particular the petition creator, Alex Baynham, whom I had the pleasure of meeting as part of my preparations for this debate; he is in the Public Gallery today. My role as a member of the Petitions Committee is to introduce the petition and key contours of the issues and considerations that it touches on, hopefully to help ensure that we have a productive debate that enhances our understanding.

I believe that at the heart of any balanced discussion on this issue is a recognition of two simultaneous features of the online world’s development over the last 30 years. First, there has been the development of incredible opportunities for people to communicate and form bonds together online, which go far beyond the previous limitations of geography and have allowed a huge multiplication of opportunities for such interactions—from marketplaces to gaming to dating. We should welcome that in a free society.

Secondly, the opportunities for harm, hate and illegality have also hugely increased, in a way that previous legislation and regulation was totally unequipped to deal with. That is what prompted the introduction of the Online Safety Act 2023. As the Minister at the time said:

“The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them.” —[Official Report, 12 September 2023; Vol. 737, c. 799.]

Although some aspects of the Online Safety Act have been more prominent than others since its introduction, it is important in this debate to recall that there are multiple parts of the Act, each of which could separately be subject to amendment or indeed repeal by Parliament. There was the introduction of a framework placing obligations on in-scope services—for example, social media platforms—to implement systems and processes to reduce the risk of their services being used for illegal activity, including terrorism offences, child sexual exploitation and abuse, and drugs and weapon offences. Those duties have been implemented and enforced since March 2025. Secondly, the Act required services to implement systems and processes to protect under-18s from age-inappropriate content—both content that may be passed from user to user, and content that is published by the service itself, such as pornography sites.

We should recognise that the Online Safety Act implemented measures to regulate a wide range of diverse services, from social media giants to commercial sites, but also online spaces run by charities, community and voluntary groups, and individuals. As the first substantive attempt at regulating safety online, the OSA has brought into regulation many services that have not previously been regulated.

Mr Baynham explained to me that those services lay behind his primary motivation in creating the petition. He was spurred by concerns about the impact of the Online Safety Act on online hobby and community forums of the type he uses. They are online spaces created by unpaid ordinary people in their spare time, focused on the discussion of particular shared interests—games, a film or TV series, or football teams. A number of the administrators of such forums have expressed concern that they now face liabilities and obligations under the Online Safety Act that they are not equipped to meet.

I must declare an interest at this stage. For more than a decade, I have regularly used the Ready To Go—RTG—Sunderland AFC fans’ messaging boards. They provide thousands of Mackems with online space to discuss the many ups and downs of our football club and associated issues facing the city, with current topics including club finances, “Match of the Day” tonight and, following a successful Wear-Tyne derby yesterday, “The Mag meltdown” thread.

I heard directly from the administrator of the RTG forum in preparation for this debate. He told me that he came close to shutting the site down when the Online Safety Act came into force and has still not ruled that out completely. He points out that there have been thousands of pages of guidance issued by Ofcom on the implementation of the Act, and that, while tech companies with large compliance teams have the capacity to process that volume of guidance, having volunteers do the same is a huge challenge.

Ofcom has stressed that it will implement the Act in a way that is risk-based and proportionate, and has offered a digital toolkit targeted at small services in response. But even for the smaller sites the rules seem to require, for example, a separate and documented complaints system beyond the usual reporting functionality that small forums have often had in place. The administration of that system has been described to me as time-consuming and liable to being weaponised by trolls.

Some forum hosts feel that the uncertainty regarding the liability they face under the Online Safety Act is too much. The reassurance offered that prosecution is “unlikely” has not given sufficient confidence to some who have been running community sites as volunteers. To some, the risk of liability, personal financial loss or simply getting it wrong has been too great; when the Act came into force, 300 small forums reportedly exited the online space or lost their status as independent forums and migrated to larger platforms such as Facebook.

Iqbal Mohamed Portrait Iqbal Mohamed (Dewsbury and Batley) (Ind)
- Hansard - - - Excerpts

The hon. Member is making an extremely passionate and informed speech. While the unintended consequences of the Online Safety Act on the small forums and specialist groups that he highlights are critical, does he agree that a balance needs to be struck, whereby under-age children are protected from harmful content on whatever forum or website they are exposed to?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

I absolutely agree with the hon. Gentleman, and he will not be surprised that I will come on in my speech to deal with some wider issues about the Online Safety Act, in particular the protection of children. I think that today’s debate is likely to be more nuanced than simply whether we should maintain or repeal the Online Safety Act, and we will talk about the implementation and potential evolution of the Act over time.

The ask that I have heard from administrators of small forums is that Ofcom take further steps to simplify the record-keeping and risk-assessment burdens for small sites. When I have met with other organisation such as the Open Rights Group in preparation for this debate, they have suggested that exemptions be made for small and low-risk sites.

It is clear that a size-only exemption would not be appropriate; unfortunately, there have been small platforms specifically to host harmful content, such as forums dedicated to idealising suicide or self-harm, but it is possible that some combination of size and risk could be considered together. These questions touch at the heart of how we maintain the positives that come from vibrant and plural internet spaces while also clamping down on online harms.

Anneliese Dodds Portrait Anneliese Dodds (Oxford East) (Lab/Co-op)
- Hansard - - - Excerpts

Like my hon. Friend, I want to pay tribute to site managers and moderators; I am sad indeed that an incredible example of that function from my city of Oxford, Maggie Lewis, has passed away. She was an incredible presence online for the community and did much other community and charity work.

I looked at some of the small websites that had apparently had issues because of the Act. I found one that was an internet forum known for its open discussion and encouragement of suicide and suicide methods. I found another community website that had allegedly shut down, but is still functioning and has a forum where local people can let others know what is happening in the community—just one element of it had had to close. Does my hon. Friend agree that it is important that, when looking at the regulatory burden, we argue on the basis of facts to make the right decision?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

My right hon. Friend is absolutely right. I think, as a society, we want forums such as the ones she reports to close down—they have been harmful. But I recognise that there were others that, maybe pre-emptively, decided to shut down. Perhaps the Minister has further information on how far the reported close-downs were a one-off event, in pre-emption, rather than an ongoing, repeated loss of online spaces.

As I have outlined, we are getting at a more nuanced position from owners and operators of bona fide community forums who are concerned about how to ensure that they are meeting their obligations—in the same way that any person would meet obligations such as those under the Data Protection Act 2018, which has always applied. That is a more nuanced position, far from asking for a full-out repeal of the OSA, but rather asking how the obligations under the Act can be carried out in a proportionate manner.

Peter Fortune Portrait Peter Fortune (Bromley and Biggin Hill) (Con)
- Hansard - - - Excerpts

I thank the hon. Member for introducing the debate—and, as somebody who shares a house with a Newcastle fan, I thank him for a miserable weekend. It is important that we get the safety elements and aspects of the Online Safety Act correct, but does he agree that it should not be used as a blunt tool to stifle freedom of speech online?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

I do, but I will come to some of the issues regarding social media platforms in what I am about to say. I certainly would not want to stifle the freedom of speech of Newcastle fans expressing their genuine heartfelt sorrow about yesterday’s events.

I turn now to wider concerns that have been expressed about the Online Safety Act, which, although they are not the motivations of the petition creator, are undoubtedly held by a number of people who signed the petition. The number of petition signatories notably increased in the immediate aftermath of the implementation of age verification requirements that have been applied to significant parts of the internet, from pornography to some elements of social media. Here, I am afraid I find it significantly harder to provide balance in my introduction to the debate, having read the report by the Children’s Commissioner that was published in advance of the implementation of the OSA, which stated:

“It is normal for children and young people to be exposed to online pornography”,

as 70% of children surveyed responded that they had seen pornography online. The report also found:

“Children are being exposed at very young ages…the average age a child first sees pornography online is 13…More than a quarter…of respondents had seen online pornography by the age of 11.”

Lola McEvoy Portrait Lola McEvoy (Darlington) (Lab)
- Hansard - - - Excerpts

My hon. Friend is making a clear and coherent speech. I surveyed 1,000 young people in my constituency, and the forum leads of my online safety forum said that they had found graphic and disturbing content, which they had never searched for, regularly fed to them through the algorithms. Does the hon. Member agree that that is robbing children of their childhood and that age verification needs to be stronger, not weaker, as a result of the 2023 Act?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

I agree that there is significant work to be done to effectively implement the OSA. I will touch on that, and the Minister may wish to do so in his response.

Crucially, the report by the Children’s Commissioner found that children were most likely to see pornography by accident—a key point that some of the criticism of the Act fails to grasp. The horrifying statistics, showing the scale of online harm to children that the OSA is working to reduce, make it obvious why in a recent survey 69% the public backed the introduction of age verification checks on platforms, and why children’s charities and children’s rights organisations overwhelmingly back the OSA and—to my hon. Friend’s point—want it implemented more rapidly and robustly.

I have heard that some petition signatories are particularly concerned about age verification on platforms, such as X, Reddit or Discord, beyond those specifically designed as pornography sites. However, the report by the Children’s Commissioner shows that eight out of 10 of the main sources where children saw pornography were not porn sites; they were social media or networking sites. Those platforms that choose to allow their users to upload pornographic content—some do not—should be subject to the same age-verification requirements as porn sites in order to keep our children safe.

Following the implementation of those provisions of the Online Safety Act, it was reported that UK traffic to the most popular pornographic websites was notably down. Yes, it was initially reported that there had been in spike in the number of virtual private networks, or VPNs, being downloaded for access to those sites, but research increasingly suggests it is likely that that trend was being driven by adults worried about their anonymity, rather than by children seeking to circumvent the age limitations.

The Online Safety Act addresses harms beyond those done by porn. Content that is especially harmful to children and that children should not have access to includes very violent content and content encouraging limited eating or suicide.

Amanda Hack Portrait Amanda Hack (North West Leicestershire) (Lab)
- Hansard - - - Excerpts

Looking at those algorithms is a really important part of the Online Safety Act. When I was a county councillor looking at public health, I did a piece of work on disordered eating, and I was bombarded with content. I am not a vulnerable young person or a vulnerable adult, but my real fear is that that information is seen by people who are not as capable of managing that content. Does my hon. Friend agree that algorithm work is a key part of the Online Safety Act?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

My hon. Friend is right. The proactive duty that the Act places on providers in relation to the nature of their algorithms and their content is crucial because of the type of content to which she refers. It is right that the largest providers, and those most frequently used by kids, have to take active responsibility for keeping children safe. The implementation of the OSA means that algorithms serving harmful content to kids are now being regulated for the first time. There is a long way to go, and I am sure that other Members will say more than I can in this introduction, but I want to be clear to my constituents that I support the action that the OSA is prompting to improve children’s safety and welfare online.

Various surveys set out the impact of the Online Safety Act; Ofcom is publishing its research and a formal Government review will follow in due course. However, most impactful for me was seeing a teenage boy say on a news piece recently that, now,

“when I’m scrolling TikTok, I’m free from violence.”

That changed for him in the months following the implementation of the Online Safety Act, so it is no wonder that organisations such as the Online Safety Act Network, which I spoke to in preparation for this debate, fully support the Act’s principles. The network points to early evidence that the Act is actively reducing harm to children and emphasised that Ofcom must move beyond content filters to ensure safety by design, which would, for example, include addressing features that incentivise pile-ons, targeting an individual with abuse and harassment.

New Ofcom research shows that 58% of parents now believe that measures in the code of practice are beginning to improve the safety of children online. My belief is that we should be considering not whether to repeal the Act, but how we can continue to enforce it in a robust, effective and proportionate manner.

The way in which the Online Safety Act addresses online hate has perhaps not had as much focus as it might have. As well as being a member of the Petitions Committee, I am privileged to be a member of the Home Affairs Committee, which is conducting an inquiry into combating new forms of extremism. It is very clear from the public evidence that we have received so far that, left unregulated and unchallenged, online spaces and services can be used to amplify hate, thus risking a rise in extremist action, including violence.

Analysis by the Antisemitism Policy Trust highlights that there are patterns of co-ordinated and persistent misogynistic, anti-immigrant, anti-Government and antisemitic discourse on social media, with bot accounts being repeatedly used to amplify misleading or harmful narratives that fuel hate and may increase the risk of violence. Such content often breaches platforms’ own terms of service, but under the Online Safety Act, I understand that Ofcom category 1 services will now be mandated to proactively offer users optional tools to help them to reduce the likelihood that they will encounter legal but harmful content such as that.

There is much to be done to implement those provisions in an appropriate manner. However, I invite anyone calling for full repeal of the Act to consider how we as a society deal with the rise of extremism, and a context where the internet can be used as a sort of free-for-all fuelled by hate-filled algorithms that thrive on and incentivise division and hatred, rather than consensus and civic peace.

I am aware that there are large parts of the Online Safety Act that I have not been able to touch on today; I hope that others will do so during the debate. There are questions about end-to-end encryption, cyber-flashing, the creation of abusive deepfakes, AI moderation and chatbots.

Manuela Perteghella Portrait Manuela Perteghella (Stratford-on-Avon) (LD)
- Hansard - - - Excerpts

The hon. Member is making a strong and thoughtful case. Does he agree that although the Act regulates user-to-user services, it leaves a significant gap around generative AI chatbots, despite the growing evidence of harm caused to children from private interaction with them? And does he share my concern that the speed at which this technology is developing risks outpacing the legislative framework that we have in place?

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

I agree with the hon. Lady. In my understanding, when the legislation was drafted, it was not initially clear to those who drafted it that AI would develop at the astonishing pace that it has in recent years. I ask the Minister to reflect on that point in addressing the implementation of the Act and its potential future evolution through primary legislation.

Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

I thank my hon. Friend for giving way and for being so generous with his time. Can we also pass on to the Minister that, going forward, there is a possibility to brand bots? That would require the Online Safety Act to be amended to make sure that any profile that is a bot—generated by AI—is explicitly marketed as such, which would protect users as AI advances.

Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

My hon. Friend makes that point well, and the Minister will have heard it.

As this discussion continues, I hope that we can find a way of reflecting these two areas of balance—these two features of the online world now. First, there is the absolute primacy of safeguarding children and tackling serious online harms, but it is also important to recognise the real benefits that living in an increasingly connected society bring us all. I think those are very much the motivations of the petition’s creator—we are talking about the work done by good, civic-minded folk, and creators and administrators of online communities and hobby forums across the country. Naturally, as our learning about the implementation of the Act continues, there is a way of doing that that supports the efforts of those people without risking such sites being used to further online harms.

The consensus, I think it is fair to say, is that reform of the Act, rather than repeal, is the realistic route forward. That is natural with such groundbreaking legislation, but reform must be sensitive to the scale, proportionality and privacy, as well as the emerging and changing nature of online harms. I thank Members for their time and their interventions, and I look forward to a positive debate.

Mark Pritchard Portrait Mark Pritchard (in the Chair)
- Hansard - - - Excerpts

I remind colleagues that if they wish to speak, they should bob—quite a few colleagues are bobbing already, so thank you for that.

16:53
Ann Davies Portrait Ann Davies (Caerfyrddin) (PC)
- Hansard - - - Excerpts

Diolch yn fawr, Mr Pritchard. It is a pleasure to serve under your chairmanship. The Online Safety Act certainly has its weaknesses, but I do not believe that it should be abolished. This law has made progress in protecting children online. Scrapping it would throw them right back to well-known harms.

I will briefly focus my remarks on one area where the Act is not adequate: AI chatbots. AI chatbots have developed rapidly in recent years and are becoming ingrained in our children’s lives. Let me give hon. Members a few figures. One in four children aged 13 to 17 in England and Wales have turned to AI chatbots for mental health support. Vulnerable children are even more at risk: 26% say they would rather talk to an AI chatbot than a real person, and 23% say they use chatbots because they do not have anyone else to talk to. Children do not have anyone else to talk to—this is the society we are creating.

Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

The Government launched the youth strategy last week, the first in over two decades. It was on the back of stark research that found that one in four children growing up today do not have a trusted adult they can reach out to. Does the hon. Lady agree that functioning AI could be put to good use in the NHS? It could support signposting and make sure that children can get to the charities doing great work to support them, rather than giving them algorithm-based advice?

Ann Davies Portrait Ann Davies
- Hansard - - - Excerpts

Absolutely. Personally, I think that the algorithms in the system are a disaster. Wales is very different from England, so I have to be careful that I am not treading on the toes of the Senedd, because it does excellent work on youth services, in fairness. In my Caerfyrddin constituency, we have a number of youth projects that are doing really well, including Dr Mz, which provides services to over 500 children every week who come through its doors. Surely a person-to-person conversation is so much better than looking for something online, because we do not know what is coming through the chatbot. This is my main concern.

I have mentioned the scale of the issue that we are facing. While I appreciate that a multifaceted approach is crucial to ensure that our children are safe and thriving, we cannot afford to get this wrong. Ofcom and the Secretary of State have acknowledged that AI chatbots mostly fall outside the scope of the Online Safety Act. I welcome the announcement from the Secretary of State that the Government are exploring the tougher regulation of AI chatbots, and I have asked Ofcom to clarify expectations for any that are covered by the Act, alongside a public information campaign coming next year. However, I am concerned that we are not moving at the pace or with the sense of urgency needed to get a real handle on this issue.

Can the Minister share more specific details about the Government’s plans and a timeline for implementing tougher regulation of AI chatbots? Online safety for children is a priority for all of us, and I hope that Members across the House can agree that this is a shared goal that must not be politicised. Diolch.

16:57
Jim McMahon Portrait Jim McMahon (Oldham West, Chadderton and Royton) (Lab/Co-op)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Pritchard. The new media has quickly become the worst of the old media: owned, controlled and directed by the wealthy and powerful. My particular focus will be on social media, because it is no longer a movement of the people, nor has it been built or designed for the public good. It had the potential to be, but it has been deliberately designed not to be, and we are paying the price, with real harm, hate, division, exploitation and extremism normalised.

I hear the petitioners’ concerns about the impact on community forums, but the truth is that online regulation does not go anywhere near far enough. That is because the previous Government failed to take the action that was needed. For instance, there is no fit and proper persons test—there should be. There is no editorial responsibility for content on the platforms—there should be. There is no adequate protection from malign foreign influence—there should be. There is no protection from disinformation —there should be. There are no meaningful safeguards against racism, misogyny or hate—there should be. There are no restrictions on Members of Parliament monetising content they produce—instead, they should post solely in the public interest, rather than generating income into their bank account—and there should be. There is no transparency on algorithms, nor the need to declare in-kind benefit in politics in the way that there is in almost every other aspect of political gain—again, there should be.

As it stands, truth, democracy and the safeguarding of the public interest are under threat. The previous Government ducked it, offering a watered-down version that was backed up by a toothless regulator. We have seen what is possible when red lines are drawn; Australia has decided that the welfare of its children is more important than the interests of the powerful and the wealthy. That is leadership.

The UK’s failure to stand up to powerful vested interests has played right into the hands of foreign forces who wish harm on our country, our way of life and our democracy. Technology is moving fast, as we are seeing with AI, and frankly, lawmakers need to be much sharper and quicker to keep up. The first duty of any Government is to protect the national security of their citizens, so for the Government, the question is simply this: when will they start to fight on this new front with vigour and finally do what the previous Government failed to do?

17:00
Iqbal Mohamed Portrait Iqbal Mohamed (Dewsbury and Batley) (Ind)
- Hansard - - - Excerpts

It is a pleasure to speak with you in the Chair, Mr Pritchard. I thank the hon. Member for Sunderland Central (Lewis Atkinson) for his powerful and eloquent introduction to this important debate. The scale of this petition should make us reflect: over half a million people have called for the repeal of the Online Safety Act, not because online safety is unpopular, but because they believe that the legislation does not yet strike the right balance.

Let me be clear: the Online Safety Act exists for a reason. I stand in strong support of its intent, aims and objectives, and I am not in favour of its repeal. For too long, online platforms have failed to protect users, particularly children, from serious harm. The statistics are sobering: nearly one in five children aged 10 to 15 have exchanged messages with someone they have never met; over 9,000 reported child sexual abuse offences in 2022-23 involved an online element; and, in recent years, we have seen tragic cases where exposure to harmful online content has contributed to devastating outcomes. Repealing the Act would leave us with very little meaningful protection, so it remains central for regulating online spaces in the UK. We must accept that necessary truth, although it is a hard pill to swallow.

Supporting the Act, however, does not mean ignoring the parts that need important improvements. One of the most significant concerns is age restriction. Age-gating can and should play a role in protecting children from genuinely harmful content, but it is increasingly clear that the boundaries of age restrictions are not defined well. There is growing evidence that lawful political content, including news and commentary on conflicts such as Gaza, Ukraine and Sudan, is being placed behind age gates.

Teenagers aged 16 and 17 are finding themselves blocked from accessing political information and current affairs, sometimes more strictly than in film and television content regulated by the British Board of Film Classification. That should give us pause, particularly when the House is considering extending the vote to 16-year-olds. If we believe that young people should be active participants in our democracy, we cannot also allow systems that restrict their access to political debate by default, just because these are difficult and sensitive topics. What is or is not age-restricted needs to be far clearer, more consistent and more proportionate.

The second area where clarity is urgently needed is generative AI. As we are having this debate, the Home Secretary is making a statement on violence against women and girls, which she has rightly described as a “national emergency”. The Government’s five-year national strategy acknowledges the growing threat posed by intimate deepfakes, with one survey by the National Society for the Prevention of Cruelty to Children showing that three in five people fear becoming a victim. With current laws proving too difficult to apply in complex and rapidly evolving cases, what specific legislative proposals are the Government hoping to develop to address deepfake abuse?

When this legislation was drafted and passed, the pace of AI development was very different. Today, AI tools and chatbots are embedded across social media, search engines and messaging platforms, with people relying on ChatGPT, Gemini and Copilot as search engines and virtual assistants embedded into almost every online service we use. They can generate harmful and misleading content within seconds, including advice related to self-harm, eating disorders, substance misuse and suicide assistance.

Only last week, I led a debate in Westminster Hall on the need for stronger AI regulation. That debate reinforced a growing concern that many AI-driven services currently sit at the edges of the Online Safety Act. Although Ofcom has acknowledged that gap and issued guidance, guidance alone is not enough. We need clarity on how generative AI is regulated and whether further legislative action is required to keep pace with the technology.

The message of this petition is not a rejection of online safety; it is a call for a system that protects children while safeguarding freedom of expression, political engagement and public trust. The challenge before us is not to repeal, but to refine by strengthening definitions, clarifying age restrictions and ensuring that the Online Safety Act evolves alongside emerging technologies. If we get that right, we can protect users online without undermining the democratic values we try to defend.

17:06
Lizzi Collinge Portrait Lizzi Collinge (Morecambe and Lunesdale) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Mr Pritchard. It was interesting to hear from my hon. Friend the Member for Sunderland Central (Lewis Atkinson) about the experience of smaller hobby sites and their concerns about the Online Safety Act. I am sure that Ofcom and the Government will listen to those.

The Online Safety Act is not about controlling speech or about the Government deciding what adults think or read or say online, but about responsibility. More specifically, it is about whether we are prepared to say that the online world should have the same safety features as the offline world—whether we expect the online world to be a wild west or not. A lot of the opposition to the Online Safety Act has centred on the freedoms of adults, which I appreciate are important. Adults must be free to build their online lives as they see fit. However, that does not trump the right of children to be safe, whether online or offline, and rights are always a matter of balance.

Before I go further, it is worth being clear about what the Act actually does. It requires online services to assess the risk of harm on their platforms and put proportionate systems in place to reduce those risks. That includes harm from illegal content, such as child sexual abuse material, and harm when children are able to access content such as pornography or material that promotes suicide or self-harm. Alongside that, the Act contains proactive requirements to protect freedom of expression, and the largest platforms are now legally required to continually assess how their decisions affect users’ ability to speak freely online. That obligation is explicit and enforceable.

In many ways, the principles behind the Act are not new. Technology companies have moderated speech and removed content from their platforms since the very beginning. The difference is that, until now, those decisions were driven by opaque corporate priorities, not a clear and accountable framework of public harm.

The stakes here are high. These are some of the first young people whose entire life has been permeated by the online world. It shapes their values, relationships and mental health. For many children, when it comes to sex, self-harm or body image, the first place they turn is not a parent, a teacher or a GP; it is the internet.

I want to talk today about pornography. I think we all accept without controversy that children should not be able to access pornography offline—an adult entertainment shop does not let a 12-year-old walk in and buy a dirty video with their pocket money—but when it comes to internet pornography, we as a society have allowed children to freely access material that they are simply not mature enough to deal with. Pornography is more violent and more dangerous than ever before. Despite that, it has never been easier for children to access it. The door to the store has been wide open for too long.

According to a 2023 report by the Children’s Commissioner—before the Online Safety Act came into force—the vast majority of children surveyed said that they had seen pornography online by accident, through websites such as X, formerly known as Twitter. Kids were not even needing to seek it out; it was being fed to them. When they did seek it out, dedicated sites did not put up any barriers. The previous requirements for websites such as Pornhub were simply for someone to enter a date of birth, which meant the sole access requirement was the ability to subtract 18 from the current year. I think we all know that is not good enough.

That matters because online pornography is not passive; it teaches. It shapes how children understand sex, intimacy, power and consent. It sets expectations long before young people have the tools to question or contextualise what they are seeing. According to that same report by the Children’s Commissioner, more than half of respondents said they had seen pornography involving strangulation, and 44% reported seeing depictions of rape, many of which involved people who were apparently asleep.

Such content does not stay onscreen; it spills into real life. The Children’s Commissioner’s research showed that frequent exposure to violent sexual material is associated with a higher tolerance of sexual aggression, distorted ideas about consent and an increased likelihood of sexually aggressive behaviour. Almost half of young girls surveyed expected sex to involve physical aggression. What children learn online does not disappear when the browser closes.

With the Online Safety Act, for the first time, adult content is being age-restricted online in the same way it is offline, and sites must now use effective age verification tools. That includes third party services, which should use privacy preserving techniques to confirm users’ data without sharing personal information with the platform itself. Since the new law came into effect, Ofcom has been monitoring compliance, and many of the most visited pornography sites have introduced highly effective age checks. I will be honest: I really do not have a lot of sympathy for pornography users who object to having their age verified. If they are bothered about their privacy, they can just not use it. Pornography is not a human right; people can choose not to use it.

Pornography is not the only harm that the Act addresses: for years, platforms such as Twitter, Tumblr and TikTok have hosted vast amounts of content related to self-harm and suicide—some of it framed as support, but much not. Posts and forums provide copious instructions on how to self-harm: the implements to use, how best to hide it and where to cut to do the most damage without killing oneself. Some children accessed that content entirely by accident, before even knowing what self-harm is, while others found it when they were already struggling, and were pulled deeper into it by algorithms that reward repetition and intensity. That content not only risks normalising those behaviours; it risks glamorising them.

So many adults have no idea what is out there, and because they are not fed it on their own feeds, they do not understand the danger and the extremism. Investigations have shown that teenage accounts engaging with suicide, self-harm or depression content were then flooded with more of the same. A single click could trigger what one report from the Molly Rose Foundation described as

“a tsunami of harmful content”.

I am not saying that we should shut down places that offer support to young people who have urges to self-harm, but we need to make sure that young people can access evidence-based support and are not exposed to content that could encourage harm. That is why organisations such as Samaritans have praised the Online Safety Act.

Under the Act, platforms that recommend or promote content to users—for example, “For You” feeds on TikTok—must ensure that those systems do not push harmful content to children. Not only does that put the onus on platforms to prevent children from seeing such content, but means that, if children do come across or search for harmful content, platforms should avoid showing them more of the same so they do not go down a very harmful rabbit hole.

Clearly, it is still early days. The legislation includes a formal review, with a report to Parliament due within a few years of full implementation. We will, and should, look closely at what is working and what needs to be improved—as lawmakers, we have that responsibility—but the signs are encouraging. Sky News spoke to six teenagers before and after the new rules came into force, and five of them said that they were seeing much less harmful content in their feeds. I know that is anecdata, but it is important to listen to the experiences of young people.

Ofcom has opened investigations, and benefits have already come from them. For example, following an Ofcom investigation, file-sharing services that were being used to distribute child sexual abuse material have now installed automated technology to detect and remove such material. Proportionality is at the heart of it, and Ofcom has developed guidance to support compliance. I understand the concerns about smaller or volunteer-run forums, but some of the most harmful content appears on very small or obscure sites, so simply taking out smaller sites would be a disservice.

I am sure there will be problems that must be worked out. We should continue to explore how best to provide children with age-appropriate experiences online, and think about how to get age verification right. But while we refine and improve the system, we cannot ignore the reality that there have been serious harms and that we have a responsibility to tackle them. For the first time, the UK has a regulatory framework that forces tech companies to assess risk, protect freedom of expression and give the public far greater transparency on how decisions about online content are made.

Other countries have banned young people from social media. I have been thinking about that a lot, and I currently do not think it is the right thing to do. Online communities can provide friendship and solace to young people—particularly those who are marginalised, perhaps due to their sexual orientation, or who are restricted in life, perhaps because they are kept at home by ill health or disabilities. Online communities can offer a lot to our young children, but children have a right to be just that: children. They should not have to deal with the complexities and hardships of adult life, so we as adults must do what we can to build safe online spaces for them, just as we build safe physical spaces.

17:16
Tom Collins Portrait Tom Collins (Worcester) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Mr Pritchard.

At its birth, the internet was envisaged as a great advancement in a free society: decentralised, crowdsourced and open, it would share knowledge across humanity. As it grew, every one of us would own a platform and our voice. Of course, since then bandwidth has increased massively, which means that we now experience a rich variety of media. Storage and compute have increased by many orders of magnitude, which has created the power of big data, and generative capabilities have emerged quite recently, creating a whole new virtual world. Services no longer simply route us to what we were searching for but offer us personalised menus of rich media, some from human sources and some generated to entertain or meet demands.

We are now just starting to recognise the alarming trends that we are discussing today. Such rich media and content has become increasingly harmful. That compute, storage and big data power is being used to collect, predict and influence our most private values, preferences and behaviours. Generative AI is immersing us in a world of reconstituted news, custom facts and bots posing as people. It increasingly feels like a platform now owns every one of us and our voice.

Harms are dangerously impacting our young people. Research from the Centre for Countering Digital Hate illustrates some of the problems. On YouTube, the “Next Video” algorithm was found to be recommending eating disorder content to the account of a UK-based 13-year-old female. In just a few minutes, the account was exposed to material promoting anorexia and weight loss, and more than half the other recommended videos were for content on eating disorders or weight loss.

On TikTok, new teen accounts were found to have been recommended self-harm and eating disorder content within minutes of scrolling the “For You” feed. Suicide content appeared within two and a half minutes, and eating disorder content within eight. Accounts created with phrases such as “lose weight” received three times as many of these videos as standard teen accounts, and 12 times as many self-harm videos. Those are not isolated incidents, and they show the scale and speed at which harmful material can spiral into exponential immersion in worlds of danger for young people.

On X, formerly known as Twitter—a trigger warning for anybody who has been affected by the absolutely appalling Bondi beach Hanukkah attack—following the Manchester synagogue attack, violent antisemitic messages celebrating and calling for further violence were posted and left live for at least a week. ChatGPT has been shown to produce dangerous advice within minutes of account creation, including guidance on self-harm, restrictive diets and substance misuse.

I am grateful to hon. Friends for raising the topic of pornography. I had the immense privilege of being at an event with a room full of men who spoke openly and vulnerably about their experiences with pornography: how it affected their sex lives, their intimacy with their partners or wives, their dynamics of power and respect, and how it infused all their relationships in daily life. They said things such as, “We want to see it, but we don’t want to want to see it.” If adult men—it seems from this experience, at least, perhaps the majority of adult men—are finding it that hard to deal with, how can we begin to comprehend the impact it is having on our children who come across it accidentally?

This can all feel too big to deal with—too big to tackle. It feels immense and almost impossible to comprehend and address. Yet, to some, the Online Safety Act feels like a sledgehammer cracking a nut. I would say it is a sledgehammer cracking a deeply poisonous pill in a veritable chemistry lab of other psychoactive substances that the sledgehammer completely misses and will always be too slow and inaccurate to hit. We must keep it, but we must do better.

As an engineer, I am very aware that since the industrial revolution, when physical machines suddenly became immensely more powerful and complex, a whole world of not just regulations but technical standards has been built. It infuses our daily lives, and we can barely touch an object in this room that has not been built and verified to some sort of standard—a British, European or global ISO standard—for safety. We should be ready to reflect that model in the digital world. A product can be safe or unsafe. We can validate it to be safe, design it to be safe, and set criteria that let us prove it—we have shown that in our physical world since the industrial revolution. So how do we now begin to put away the big, blunt instrument of regulation when the problem seems so big and insurmountable?

John Slinger Portrait John Slinger (Rugby) (Lab)
- Hansard - - - Excerpts

Ofcom officials came before the Speaker’s Conference, of which I am a member, so I declare that interest. They spoke about section 100 of the Act, which gives Ofcom the power to request certain types of information on how, for example, the recommender systems work on the companies’ algorithms. Unfortunately, they said that could be “complicated and challenging to do”, but one thing they spoke about very convincingly was that they want to require—in fact, they can require—those companies to put information, particularly about the algorithms, in the public domain to help researchers. That could really help with the point my hon. Friend is making about creating regulations that improve safety for our population.

Tom Collins Portrait Tom Collins
- Hansard - - - Excerpts

I thank my hon. Friend for his remark. He is entirely right. In my own experience of engineering products, very critically, for safety, it was incumbent upon us to be fully open about everything we had done with those regulating and certifying our products for approval. We had numerous patents on our technology, which was new and emerging and had immense potential and value, yet we were utterly open with those notified bodies to ensure that our products were safe.

Similarly, I was fortunate enough to be able to convene industry to share the key safety insights that we were discovering early on to make sure that no mistake was ever repeated, and that the whole industry was able to innovate and develop in a safe way. I thank my hon. Friend the Member for Rugby (John Slinger) for his comments, and I strongly agree that there is no excuse for a lack of openness when it comes to safety.

How do we move forward? The first step is to start breaking down the problem. I have found it helpful to describe it in four broad categories, including hazards that apply to the individual simply through exposure. This would be content such as pornography, violence and images of or about abuse. And then there are hazards that apply to an individual by virtue of interaction, such as addictive user interfaces or personified GPTs. We cannot begin to comprehend the potential psychological harms that could come to human beings when we start to promote attachment with machines. There is no way we can have evidence to inform how safe or harmful that would be, but I suggest that all the knowledge that exists in the psychology and psychiatric communities would probably point to it being extremely risky and dangerous.

We have discussed recommendation algorithms at length. There are also societal harms that affect us collectively by exposure. These harms could be misinformation or echo chambers, for example. The echo chambers of opinion have now expanded to become echo chambers of reality in which people’s worldviews are increasingly being informed by what they see in those spaces, which are highly customised to their existing biases.

Tom Hayes Portrait Tom Hayes (Bournemouth East) (Lab)
- Hansard - - - Excerpts

I have met constituents to understand their concerns and ambitions in relation to online safety legislation. There is a clear need to balance the protection of vulnerable users against serious online harms with the need to protect lawful speech as we pragmatically review and implement the Act.

My hon. Friend talks about equipping our younger people, in particular, with the skills to scrutinise what is real or fake. Does he agree that, although we have online safety within the national curriculum, we need to support our teachers to provide consistent teaching in schools across our country so that our children have the skills to think critically about online safety, in the same way as they do about road safety, relationships or consent? [Interruption.]

Mark Pritchard Portrait Mark Pritchard (in the Chair)
- Hansard - - - Excerpts

Before we continue, could I ask that everybody has their phone on silent, please?

Tom Collins Portrait Tom Collins
- Hansard - - - Excerpts

Thank you, Mr Pritchard. I agree with my hon. Friend the Member for Bournemouth East (Tom Hayes). I was fortunate enough to meet the Worcestershire youth cabinet, which is based in my constituency. I was struck that one of its members’ main concerns was their online safety. I was ready for them to ask for more support in navigating the online world, but that is not what they asked for. They said, “Please do not try to support us any more; support our adults to support us. We have trusted adults, parents and teachers, and we want to work with them to navigate this journey. Please help them so that they can help us.” I thank my hon. Friend for his excellent point.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

My hon. Friend is making an excellent speech that gets to the heart of some of the tensions. However, he seems to be leaning quite strongly into how the algorithms are self-learning and catch on to what people share organically, which they double down on to commercialise the content. Does he accept that some widely used platforms are not just using an algorithm but are deliberately suppressing mainstream opinion and fact in order to amplify false information and disinformation, and that the people benefiting are those who have malign interests in our country?

Tom Collins Portrait Tom Collins
- Hansard - - - Excerpts

Absolutely. My hon. Friend is right. All those algorithms now have hidden interests, which are sometimes just to increase use, but I think we all strongly suspect that they may stray into political agendas. It is remarkable how powerful that part of the online world is. My personal view is that it is not dissimilar to the R number during covid. If a person sees diverse enough content, their worldview will have enough overlap with other people that it will tend to converge. In the old days, “The Six O’Clock News”, or the news on the radio, provided us with shared content that we all heard, whether we agreed with it or not. That anchored us to a shared narrative.

We are now increasingly in echo chambers of reality where we are getting information that purports to be news and reactions that purport to be from human beings in our communities, both of which reinforce certain views. It is increasingly possible that the R number will become greater than one, and our worldviews will slowly diverge further and further. Such an experiment has never been carried out on a society, but it strikes me that it could be extremely harmful.

While we are exploring this theme, I would like to point to the opposite possibility. In Taiwan, trust in the Government was at 9% when the digital Minister took office. They created a digital platform that reversed the algorithm so that, instead of prioritising content based on engagement—a good proxy for how polarising or divisive something is—it prioritised how strongly content resonated with both sides of the political divide. The stronger a sentiment was in bridging between those two extremes, the more it was prioritised.

Instead of people competing to become more and more extreme, to play to their own audiences, they competed to express sentiments and make statements that bridged the divide more and more. In the end, as the system matured, the Government were able to start to say things like, “Once a sentiment receives 85% agreement and approval, the Government will take it on as a goal. We will work out how to get there, but we will take it as a goal that the public say we should be shooting for.” By the end of the project, public trust in the Government was at 70%. Algorithms are powerful—they can be powerful for good or for ill. What we need to make sure is that they are safe for us as a society. That should be the minimum standard.

Finally, we can imagine harms that apply at a societal level but come through interaction. That comes, I would say, when we start to treat machines as if they are members of our society—as people. When I first started exploring this issue, I thought that we had not seen that yet. Then I realised that we have: bots on social media and fake accounts that we do not know are not human beings. They are not verified as human beings, yet we cannot help but start to believe and trust what we see. I would say that it is only a matter of time before these bots become more and more sophisticated and with more and more of an agenda—more able to build relationships with us and to influence us even more deeply. That is a dangerous threshold, which points to the need for us to deal with the issue in a sophisticated way.

What next? It is critical that we first start to develop tools—technically speaking, these are models—that classify and quantify these hazards to individual people and to us as a society, so that we can understand what is hazardous and what is not. Then, based on that, we can start to build tools and models that allow us to either validate products as safe—they should, I agree, be safe by design—or provide protective features.

Already, some companies are developing protection algorithms that can detect content that is illegal or hazardous in different ways and provide a trigger to an operating system to, for example, mask that by making it blurred or opaque, either at the screen or the camera level. Such tools are rapidly becoming more and more capable, but they are not being deployed. At the moment, there is very little incentive for them to be deployed.

If, for example, we were to standardise in the software environment interfaces or sockets of some kind so that these protective tools could be plugged into operating systems or back ends, we could create a market for developing more and more accurate and capable software.

In the world of physical safety, we use a principle called “state of the art”. In contrast to how we all might understand that term, it does not mean the cutting edge of technology; rather, it means safety features that are common enough that they should be adopted as standard and we should expect to have them. The automotive industry is a great example. Perhaps the easiest feature for me to point to is anti-lock brakes, which started out as a luxury feature in high-end vehicles, but rolled out into more and more cars as they became more affordable and accessible. Now they come as standard on all cars. A car without anti-lock brakes could not be sold because it would not meet the state of the art.

If we apply a similar principle to online protection software, tech companies with capable protections would have a guaranteed market. The digital product manufacturers or service providers would have to keep up; that would drive both innovation and uptake. These are already practised in industry. They cost the public purse nothing and generate growth, high-value jobs and national capabilities. Making the internet safe in the right way does not close it down; it creates freedoms and opens it up—freedom to trust what we are seeing; freedom to use it without being hurt; and freedom to rely on it without endangering our national security.

There is another parallel. We would not dream of building a balcony without a railing, but if we had built one we would not decide that the only way to make it safe was to declare that the balcony was for use only by adults. It still would not be safe. Adults and children alike would inevitably come to harm and many of our regulations would not allow it: in fact, there must be a railing that reaches a certain height and is able to withstand certain forces, and it must be designed with safety in mind and be maintained. We would have an inspection to make sure it was safe. Someone designing or opening a building with an unprotected, unbarriered balcony could easily expect to go to prison. We have come to expect our built environment to be safe in that way; having been made robustly safe for adults, it is also largely safe for children. If we build good standards and regulation, we can all navigate the digital world safely and freely.

Likewise, we need to build the institutions to ensure fast and dynamic enforcement. For services, there are precedents for good enforcement. We have seen great examples of that when sites have not complied, such as TCP ports for payment systems being turned off instantly. That is a really strong motivation for a website to comply. It is fast, dynamic and robust, and is very quickly reversible, as the TCP port can be turned back on and the website can once again accept payments. We need that kind of fast, dynamic enforcement if we are to keep up with the fast and adaptive world working around us.

On the topic of institutions, I would like to point out—I would not be surprised if my hon. Friend the Member for Rugby (John Slinger) expands on this—that when television and radio came into existence, we built the BBC so that we would have a trusted source among those services. It kept us safe, and it also ended up projecting our influence around the world. We need once again to build the institutions or expand them and the infrastructure to provide digital services in our collective interest.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

My hon. Friend is making a very good speech; maybe he should consider a career in TED Talks after this. A number of competitor platforms have been tried, such as Bluesky as an alternative to X, but the take-up is not sustained. I wonder whether the monopoly that some of these online platforms have is now so well embedded that people have become attached to them out of habit. As Members, we must all feel the tension at times about whether we should or should not be on some of these platforms.

There is a need for mainstream voices to occupy these spaces to ensure that we do not concede to extremes of any political spectrum, but we are always going to be disadvantaged if the algorithm drives towards those extremes and not to the mainstream. I just test the principle of an online BBC versus whether or not there should be a more level playing field for mainstream content on existing platforms.

Tom Collins Portrait Tom Collins
- Hansard - - - Excerpts

My hon. Friend is, of course, right. If we regulate for safety, we do not need to worry about the ecosystem needing good actors to displace it. At the same time, however, those good actors would have a competitive and valuable role to play, and I do not want to undervalue the currency of trust. Institutions such as the BBC are so robustly trustworthy that they have a unique value to offer, even if we do manage to create a safe ecosystem or market of online services.

I am convening a group of academics to start trying to build the models I discussed as the foundation for technical standards for safe digital products. I invite the Minister to engage the Department in this work. That is vital for the safety of each of us and our children as individuals, and for the security and resilience of our society. I also invite anybody in the technical space of academia or industry exploring some of these models and tools to get in touch with me if they see this debate and are interested.

Only by taking assertive action across all levels of technical, regulatory and legal governance can we ensure the safety of citizens. Only by expanding our institutions can we provide meaningful enforcement, designing and building online products, tools and infrastructure. If we do those things, the internet will be more open, secure, private, valuable and accessible to all of us. Good regulation is the key to a safe and open internet.

17:39
Emily Darlington Portrait Emily Darlington (Milton Keynes Central) (Lab)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Pritchard. I want to add some actual data to our debate today. We are inundated, often online or in our inboxes, with messages about repealing the Online Safety Act. These are well-funded campaigns. There is also a lot of material online coming from very particular sources, not necessarily within the UK. Actually, 70% of people in the UK support the Online Safety Act and a similar number support age verification. Much of that has to do with what our children are seeing online. Almost 80% of people aged 18 to 21 have seen sexual violence before age 18. That is a huge number of people whose initial sexual experiences or viewing of sex involves violence.

What does the Online Safety Act do? It puts porn back on the top shelf—it does not get rid of it. We are all of an age to remember when porn was on the top of the magazine rack in the corner shop. Now it is being fed to our children in their feeds. The issue is also the type and nature of porn that people are seeing online: 80% of online porn has some kind of strangulation in it. That has real-world consequences, as we have seen from the latest data on women’s health in terms of strokes. Strangulation is now the second leading cause of strokes among women in the UK. That is shocking, and it is why we needed the Online Safety Act to intervene on what was being fed to us.

In Milton Keynes, 30% of young people have been approached by strangers since the implementation of the Online Safety Act. They are most frequently approached on Roblox. We do not automatically identify gaming platforms as places where people are approached by strangers, but we know from police investigations that they approach young children on Roblox and move them to end-to-end encryption sites where they can ask them to share images.

In 2024, there were 7,263 online grooming offences—remember that those will just be the ones that are not in end-to-end encryption sites. There were 291,273 reports of child sexual abuse material identified last year—again, remember, that is not the material being shared on end-to-end encryption sites, because we have no idea what is actually being shared on those. Some 90% of that material is self-generated—that is, groomers asking children to take pornographic pictures of themselves and share them. Once a picture is shared with a groomer, it goes into networks and can get shared anywhere in the UK or the world. The UK is the biggest consumer of child sexual abuse images. The police reckon that 850,000 people in the UK are consuming child sexual abuse images.

John Slinger Portrait John Slinger
- Hansard - - - Excerpts

I thank my hon. Friend for making an impassioned and powerful speech. Does she agree that outrage ought to be directed at us for not doing enough on these issues rather than for the way in which we have started to try to tackle them?

If the behaviours that my hon. Friend and other hon. Members have referred to happened in the real world—the so-called offline world—they would be clamped down on immediately and people would be arrested. Certain items cannot be published, be put in newsagents or be smuggled into school libraries and people could not get away with the defence, “This is a matter of my civil liberty.” We should be far more robust with online companies for the frankly shoddy way in which they are carrying out their activities, which is endangering our children and doing immense damage to our political system and wider life in our country and beyond.

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I completely agree and I am going to come to that.

I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes when someone uploads something to a site with end-to-end encryption. For those who do not know, we use such sites all the time—our WhatsApp groups, and Facebook Messenger, are end-to-end encryption sites. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads something and it becomes encrypted, their image or message is screened. It is screened for bugs to ensure that they are not sharing viruses, but equally it could be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images’ collection and sharing with other paedophiles.

My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.

Lizzi Collinge Portrait Lizzi Collinge
- Hansard - - - Excerpts

My hon. Friend has been talking about the dangers that children are exposed to. Does she believe that parents are equipped to talk to their children about these dangers? Is there more we can do to support parents to have frank conversations with their children about the risks of sharing images and talking to people online?

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I completely agree. As parents, we all want to be able to have those conversations, but because of the way the algorithms work, we do not see what they see. We say, “Yes, you can download this game, because it has a 4+ rating.” Who knows what a 4+ rating actually means? It has nothing to with the BBFC ratings that we all grew up with and understand really well. Somebody else has decided what is all right and made up the 4+ rating.

For example, Roblox looks as if it is child-ready, but many people might not understand that it is a platform on which anyone can develop a game. Those games can involve grooming children and sexual violence; they are not all about the silly dances that children do in the schoolyard. That platform is inhabited equally by children as it is by adults.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

My hon. Friend does well to draw attention to the gaming world. When most of us think about online threats, we think about social media and messaging, but there are interactive ways of communicating in almost every game in existence, and that can happen across the world.

In Oldham, we have had a number of section 60 stop-and-search orders in place, because of the number of schoolchildren who have been carrying knives and dangerous weapons. Largely, that has been whipped up not in the classroom, but online, overnight, when children are winding each other up and making threats to each other. That has real-life consequences: children have been injured and, unfortunately, killed as a result of carrying weapons in our community. Does my hon. Friend share my concern that this threat is multifaceted, and that the legislation probably should not be so prescriptive for particular platforms at a point in time, but should have founding principles that can be far more agile as new technology comes on stream?

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

My hon. Friend raises two really important points. First, if we try to create legislation to address what companies do today, it will be out of date by the time that it passes through the two Houses. What we do must be done on the basis of principles, and I think a very good starting principle is that what is illegal offline should be illegal online. That is a pretty clear principle. Offline legislation has been robustly challenged over hundreds of years and got us to where we are with our freedom of speech, freedom of expression and freedom to congregate. All those things have been robustly tested by both Houses.

John Slinger Portrait John Slinger
- Hansard - - - Excerpts

On that critical point about the lack of equality between offline and online, does my hon. Friend agree that if I were to go out into the street and staple to somebody’s back an offensive but not illegal statement that was impermeable to being washed off and remained on their back for months, if not years, I would probably be subject to immediate arrest, yet online that happens routinely to our children—indeed, to anyone in society, including politicians? Is that not illustrative of the problem?

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.

My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.

The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point. The lack of proper definition around small but non-harmful sites versus small but harmful sites is very unclear, and it is really important that the Act provides some clarity on that.

We do not have enough protections for democracy. The Science, Innovation and Technology Committee, which I am a member of, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it went from kind of “meh” to really awful. The platforms say, “We don’t police our content. We’re just a platform.” That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.

The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows the rise of misogyny leading to the rise of every other form of extremism and how that links back to the online world. If this was all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether into violence against women or further into antisemitism, anti-Islam, anti-anybody who is not the same colour, or anti-anybody who is perceived not to be English from Norman times.

The algorithms provoke violent and shocking content, but they also shadow-ban really important content, such as information on women’s health. Platforms are happy to shadow-ban terms such as “endometriosis” and “tampon”—and God forbid that a tampon commercial should feature red liquid, rather than blue liquid. That content gets shadow-banned and is regularly taken down and taken out of the algorithms, yet the platforms say they can do nothing about people threatening to rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution; 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could do that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee’s No. 1 recommendation.

[Sir John Hayes in the Chair]

Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.

Tom Hayes Portrait Tom Hayes
- Hansard - - - Excerpts

We heard today from the MI6 chief, who talked about how Russia is seeking to “export chaos” into western democracies and said that the UK is one of the most targeted. Does my hon. Friend agree that we need online safety, because it is our national security too, and that as we face the rising threat from Putin and the Kremlin, we need as a country to be secure in the air, at sea, on land and in the digital space?

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I absolutely agree with my hon. Friend. They seek to promote chaos and the destruction of British values, and we need to fight that and protect those values.

The AI nudifying apps, which did not even exist when the Online Safety Act came in, need a very fast response. We know that deepfakes and AI nudifying apps are being used overwhelmingly against classmates and colleagues. Think about how it destroys a 13-year-old girl to have a fake nude photo of her passed around. The abuse that we politicians and many others receive from fake and anonymous accounts needs to be addressed. Seventy-one per cent of British people consider this to be a problem, and we need to take action. AI chatbots are another thing that was not foreseen in the development of the Online Safety Act, and therefore it is far behind on them, too.

The Online Safety Act is in no way perfect, but it is a good step forward. We must learn the lessons of its implementation to go further and faster, and listen to British parents across the country who want the Government’s help to protect our children online—and we as a Government must also protect our democracy online.

17:58
Victoria Collins Portrait Victoria Collins (Harpenden and Berkhamsted) (LD)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Sir John. I congratulate the hon. Member for Sunderland Central (Lewis Atkinson), who made a very eloquent opening speech, and Members from across the Chamber, who have touched on really important matters.

As the hon. Member mentioned, the online space gives us great opportunities for connection and knowledge gathering, but also opportunities for greater harms. What has come across today is that we have addictive algorithms that are pushed in furtherance of commercial and malevolent interests—security interests, for example, although not the security of Great Britain—with no regard for the harm or impact they have on individuals or society.

When it comes to the Online Safety Act, we must get the balance right. Its protections for children and the vulnerable are vital. Of course, it is important to maintain freedom of speech and access to information. The Act is a step in the right direction in protecting children from extreme content, and we have seen changes in pornographic content. However, there are areas where it has not gone far enough, and it is not ready for the changes that are coming at a fast pace. There are websites that serve a public good that are age-gated, and forums for hobbies and communities that are being blocked. As the Liberal Democrats have said, we have to get the balance right. We also have to look at introducing something like a digital Bill of Rights with agile standards in the face of fast-paced changes, to embed safety by design at the base.

The harms that we need to protect children and vulnerable people from online are real. The contributions to this debate from hon. Members from across the House have been, as always, eye-opening and a reminder of how important this issue is. On pornographic content, we heard from the hon. Members for Morecambe and Lunesdale (Lizzi Collinge) and for Milton Keynes Central (Emily Darlington) sickening reminders of the horrific content online that young people see—and not by choice. We must never forget that, as has also been said, people are often not seeking this content, but it comes through, whether on X, which was Twitter, or other platforms. The Molly Rose Foundation highlighted that

“children using TikTok and X were more than twice as likely to have encountered…high risk content compared to users of other platforms.”

The online world coming to life has been mentioned in this debate. One of my constituents in Harpenden wrote to me, horrified that her daughter had been strangled on a dancefloor, because it showed how violent, graphic content is becoming normalised. That struck me to my core. Other content that has been mentioned: suicidal content, violent content and eating disorder misinformation, which the hon. Member for Worcester (Tom Collins) talked so eloquently about. The Molly Rose Foundation also highlighted that one in 10 harmful videos on TikTok have been viewed more than 1 million times, so we have young people seeing that ex content.

Even beyond extreme content, we are starting to see the addictive nature of social media, and the insidious way that this short-form content is becoming such a normalised part of many of our lives. Recent polling by the Liberal Democrats revealed that 80% of parents reported negative behaviours in their child due to excess phone usage, including skipping meals, having difficulty sleeping, or reporting physical discomforts such as eye strain or headaches. Parents and teachers know the real harms that are coming through, but young people themselves do too. I carried out a safer screens tour in my constituency in which I spoke to young people. Many of them said that they are seeing extreme content that they do not want to see, and that, although they have blocked the content, it comes back. The Online Safety Act is helping to change that, but it has not gone far enough. The addictive element of social media is important. In our surveys, two quotes from young people stood out. One sixth-former said that social media is

“as addictive as a drug”,

and that they felt its negative effects every day. Another young person simply wrote, “Help, I can’t stop.” Young people are asking for help and protection; we need to hold social media giants and online spaces to account.

It is welcome that some of those harms have been tackled by the Online Safety Act. On pornography, Pornhub has seen a 77% reduction in visitors to its website; Ofcom has launched 76 investigations into pornography providers and issued one fine of £50,000 for failing to introduce age checks, but we need to ask whether that goes far enough. It has come across loud and clear in this debate that the Online Safety Act has not gone far enough. Analysis has shown that Instagram and TikTok have started to introduce new design features that comply with the Online Safety Act, but game the system to still put forward content that is in those companies’ commercial interests, and not in the interests of young people.

Other extremely important harms include the new harms from AI. Many more people are turning to AI for mental health support. Generative AI is creating graphic content, and the Internet Watch Foundation found that

“reports of AI-generated child sexual abuse material have more than doubled in the past year”

and the IWF says it is at the point where it cannot tell the difference any more—it is horrific.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

The hon. Lady is making a very important point. It really concerns me to see just how desensitised young people or adults can become when they see that type of content, and that inhumane content is directly linked to misogyny and racism. While I know no Member of this House would say such a thing, outside this place I could imagine an argument being made that harm depicted in AI-generated content is not real harm, because the content in itself is not real and no real abuse has been carried out. However, does the hon. Lady share my concern that such content is incredibly harmful, and that there is a real danger that it entraps even more people down the very dark route to what is essentially child abuse and to further types of harm, which will then present in the real world in a way that I do not think even Parliament has yet registered? In a sense, this problem is becoming more and more of a public health crisis.

Victoria Collins Portrait Victoria Collins
- Hansard - - - Excerpts

Absolutely. The insidious part of this issue is the normalisation of such harmful content. In a debate on Lords amendments to the then Data (Use and Access) Bill, on creatives and AI, I mentioned the fact that, in the two weeks since the previous vote, we had seen the release of Google Veo 3—the all-singing, all-dancing video creation software. We are moving so quickly that we do not see how good AI-generated content is becoming. Some content that we see online is probably AI-generated, but we do not realise it. On top of that, as the hon. Gentleman said, AI normalises extreme content and produces content that people think is real, but is not. That is very dangerous for society.

My next point concerns deepfakes, which are undermining trust. Some deepfakes are obvious; some Members of Parliament and news presenters have been targeted through deepfakes. Just as important, however, is the fact that much deepfake content seems normal, but is undermining trust in what we see—we do not know what is real and what is not any more. That is going to be very dangerous not only in terms of extreme content, but for our democracy, and that argument has been made by other Members in this debate.

It is also worrying that social media platforms do not seem to see that problem. To produce its risk assessment report, Ofcom analysed 104 platforms and asked them to put in submissions: not a single social media platform classified itself as high risk for suicide, eating disorder or depression—yet much of what we have heard during this debate, including statistics and anecdotal stories, shows that that is just not true.

On the other hand, while there are areas where the Online Safety Act has not gone far enough, in other areas it has overstepped the mark. When the children’s code came into place, Lord Clement-Jones and I wrote to Secretary of State to outline some of our concerns, including political content being age-gated, educational sites such as Wikipedia being designated as category 1, and important forums about LGBTQ+ rights, sexual health or potentially sensitive topics being age-gated, despite being important for many who are learning about the world.

Jamie from Harpenden, a young person who relies on the internet heavily for education, found when he was looking for resources that a lot of them were flagged as threatening to children and blocked, and felt that that prevented his education. Age assurance systems also pose a problem to data protection and privacy. The intention behind this legislation was never to limit access to political or educational content, and it is important that we support access to the content that many rely on—but we must protect our children and vulnerable people online, and we must get that balance right.

I have a few questions for the Minister. Does he agree with the Liberal Democrats that we should have a cross-party Committee of both Houses of Parliament to review the Online Safety Act? Will he confirm what resources Ofcom has been given? Has analysis been conducted to ensure that Ofcom has enough resources to tackle these issues? What are the Government doing about AI labelling and watermarking? What are they doing to tackle deepfakes? Does the Minister agree that it is time to support the wellbeing of our children, rather than the pockets of big tech? Will the Minister support Liberal Democrat calls to increase the age of data consent and ban social media giants from collecting children’s data to power the addictive algorithms against them? We are calling for public health warnings on addictive social media for under-18s and for a doomscroll cap. Most important is a digital bill of rights and standards that, in light of the fast pace of change, need to be agile.

Our young people deserve better. We need to put children, young people and vulnerable people before the profits of big tech. We will not stop fighting until that change is made.

18:10
Julia Lopez Portrait Julia Lopez (Hornchurch and Upminster) (Con)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Sir John, not least because it means that you cannot speak. I think you would happily take up a good hour of the debate talking about the perils and ills of the internet, and how it needs to be shut down, so that is probably for the best.

John Hayes Portrait Sir John Hayes (in the Chair)
- Hansard - - - Excerpts

That is all true, by the way.

Julia Lopez Portrait Julia Lopez
- Hansard - - - Excerpts

I congratulate the hon. Member for Sunderland Central (Lewis Atkinson) on introducing the debate. He made a particularly excellent contribution to last week’s petition debate on mandatory digital identification; although his party’s leadership may not have thanked him, I am sure his constituents did. He is right that the internet allows unprecedented connection, which is for good, but also for ill. Our job is to balance that inherent tension, while recognising that sometimes there is no balance to be found and that we have to make a choice when it comes to children being served a toxic online diet of extreme content.

When we were in government, that choice was the Online Safety Act, about which thousands of petitioners have raised concerns, believing that its breadth and scope are having too restrictive an effect. I have some sympathy with those concerns, because the Act is large and very complex; although it is proving effective in protecting children in many ways, the implementation undoubtedly comes with challenges, whether that is VPN usage or the inadvertent capturing of no to low-risk sites in compliance duties.

Peter Fortune Portrait Peter Fortune
- Hansard - - - Excerpts

Childnet has discovered that there has been increased downloading of VPNs by children over the last three months, as adolescents use them to circumvent age verification processes. I was interested to hear what the hon. Member for Sunderland Central (Lewis Atkinson) said—I presume he was referring to the Open Rights Group. I just had a quick look at the research, and although it says that it is not the youngest children who do that, it is children from the age of 13 up, so these are vulnerable adolescents. Does my hon. Friend agree that, for the Online Safety Act to be successful, the use of VPNs has to be examined further?

Julia Lopez Portrait Julia Lopez
- Hansard - - - Excerpts

I agree, and I am interested to hear what the Minister has to say about VPNs—whether they should be age-gated, whether we should look at app store controls so that parents have to consent to children downloading VPN apps, or whether there are other, more effective ways of doing that.

The sites that we have talked about may be smaller community forums, or they may be volunteer-run, but they are often mid-size tech companies that do not have the resources that the social media giants have to navigate the legal risks and intricacies of the Act without diverting precious capacity that might otherwise be used to innovate and expand their businesses. I have talked to some of those businesses. They may have one person who can do legal compliance; if they are looking at the next stage of the OSA’s implementation, they may be pulled off other work that is helping to grow the economy. We have to take that very seriously and look into it. A lot of those sites are effectively no risk whatever, and the OSA is probably too burdensome for them. We do not want the Act to stifle our vital tech sector.

Concerns have been raised about freedom of speech and user privacy. I can understand those concerns in principle; particularly when age verification first came into place over the summer, there were instances where there was a practical impact and posts were restricted. However, it seems that those early examples have been recycled many times to suggest that the Act is having a wholesale dampening effect on what people feel confident saying online, and I am not sure that that is actually the case. Those concerns are often conflated with other issues, such as the policing of tweets, non-crime hate incidents, the application of legislation such as the Public Order Act 2023, and outrageous cases of state overreach, such as that of Graham Linehan. We must also be mindful of those who seek to exaggerate those concerns for the sake of big tech’s commercial interests. I note the comments that the hon. Member for Oldham West, Chadderton and Royton (Jim McMahon) made about the power that platforms hold.

On the other side of the ledger are worries that it remains relatively straightforward for under-18s to access pornography. There is no foolproof way to age-gate the internet. We have to see the Online Safety Act as a first step in a pushback against the wholly unacceptable situation that we witnessed previously. Of course some children seek out material and will continue to do so with determination, but far too many had previously been stumbling across explicit or illegal material by complete accident.

Studies that have already been cited today suggest that 41% of children first encountered porn on X, rather than seeking it out. If the Online Safety Act has a material impact on reducing that risk, it will have served its purpose. The hon. Member for Milton Keyneps Central (Emily Darlington) provided some truly shocking statistics and rightly said that we are putting porn back on the top shelf. As she was saying that, I thought back to the advent of magazines such as Zoo and Nuts when I was a teenager. They were seen as having dreadfully explicit content that was far too readily available, but they seem quite quaint when we think about what children can access now.

Pornhub reported a 77% reduction in visitors after implementing age verification measures, but it is important to note that the traffic previously going to such sites has not simply disappeared. A chunk of it has shifted to smaller sites that may be riskier because, in some cases, they think they can financially benefit from not implementing age-gating, which is against the law. Those sites need to know that Ofcom fines are coming. No website should assume that it can sidestep its legal duties. The fines are designed to outweigh any short-term commercial advantage gained by ignoring the law.

I do not believe that the best way of dealing with these concerns is to repeal the Online Safety Act, and nobody in the Chamber has advocated for that, but it is for us to review it and to work out how to tighten up child safety while being honest about the aspects of the law that need to change. A very long and winding road led to the previous Conservative Government passing the Act in 2023, but it was one of the first markers on internet regulation to be put down globally that said that the status quo—children having easy access to illegal explicit content—simply could not continue. As the hon. Member for Morecambe and Lunesdale (Lizzi Collinge) suggested, we cannot have untrammelled freedom for adults in this space, because that status quo caused significant harm to children.

It is right that we now expect more from social media platforms, which were given an opportunity to self-regulate and were found wanting. The Act has driven them to make design changes to help parents, children and teenagers, including Meta’s teen accounts and some aspects of Roblox. Without those protections, more children would encounter harmful content, receive unsolicited contact from adults, and access material that encourages self-harm, eating disorders or even suicide. Families would also continue to face barriers when seeking answers from tech companies following tragedies. Pressure on those companies will continue for as long as we see technology’s pernicious effects on our children, including a chatbot’s recent encouragement of a teenager into suicide.

Despite the criticisms levelled at this law, it remains popular with parents. Parents must remain sovereign in how children are raised, and they must have the parenting confidence to deny phones and social media to their children, but no matter how involved or savvy parents are, they need help with these challenges—challenges that creep into the heart of people’s homes in a way that we have not seen before, and find their way to children through what other children may be sharing with them.

During the years that the Act was being drawn up, we made a lot of changes, including removing the provisions relating to legal but harmful content in 2022. That is good because, as we can see, Ofcom has an enormous job on its hands dealing with some of the biggest online problems, such as age-gating pornography. Had we taken an expansionist approach, we would now be facing far greater problems around free speech, and Ofcom would have an even heavier role—not to mention workload—as arbiter of the public square.

Hon. Members need to think about that carefully: the OSA became a Christmas tree on which everybody hung their niche, individual concerns, and it became unimplementable. If the Labour Government wish to go down a more restrictive route in some of these areas, they have to be mindful of that risk. They need something that can actually be implemented in law and they need to resource the regulator to implement it.

Long before the Act was brought in, the status quo was also having a negative impact on journalistic content. There is lots of discussion about freedom of speech, but I recall having discussions as a media Minister with traditional news content providers that were extremely frustrated by west coast content moderators arbitrarily taking down their content with no opportunity to appeal. With this legislation, we introduced must-carry provisions that would give proper news content providers a greater chance of their content being visible, which is important for journalistic credibility and to make sure that we have truth in online spaces.

As I have said, the Online Safety Act represents the start of a journey as countries grapple to find the right framework through which adults can retain their freedom on the internet while children are treated as children. This debate is particularly timely as a new social media ban was introduced in Australia last week. My party will be watching that closely, but concerns about social media and mobile phones go far beyond the ability to access porn or illegal content.

As a party, the Conservatives are concerned about the impact of social media and smartphone use on children’s mental health, education and social development. I suspect that any hon. Member who has recently been to a school in their constituency will have heard about the challenges to the learning environment, the challenges of the social interactions between children, and the challenges that parents are facing at home, which go way beyond the issue of illegal content. We have also heard other concerns. I liked the way the hon. Member for Worcester (Tom Collins) described it as “a veritable chemistry lab of…psychoactive substances”. He also made some interesting observations about safety and how, as a democracy, we must think very carefully about broader harms.

The Act will be statutorily reviewed next year. I would welcome the Minister telling us whether the Government are examining the measures that have been discussed today, including whether GDPR protections on the processing of children’s data might be raised from age 13 to 16. Age-gating also needs attention. Reports of VPN use suggest that children have been circumventing protections, and we must consider whether age-gating should be applied more comprehensively, including to VPN use or via app stores or at device level to close those loopholes. Adults also need to be assured of the privacy preserving nature of the age verification tools that they are using. There are concerns about the volume of sensitive data being collected.

As I have described, we also need to ensure that low-risk tech firms are not being disproportionately burdened. They did a huge amount of work before the July phase of Ofcom’s introduction of age verification, and they are worried that they will need to make a further step change for regulatory compliance, given the burden that will place on them.

Regulation has to remain proportionate, targeting high-risk sites and services without undermining innovation or our competitive position in tech. We also need to examine, as has been discussed many times today, the emerging technologies such as generative AI and chatbots —as was suggested by the hon. Member for Dewsbury and Batley (Iqbal Mohamed) and the hon. Member for Worcester. The legal position under the OSA is not yet entirely clear, but children are increasingly exposed to AI-generated content, and we need to know if the Act is flexible enough to deal with innovations of that kind.

As hon. Members have described today, the Online Safety Act is a necessary first step, but it is only the beginning of a much longer fight to protect childhood. We have to create the space for childhood and adolescence away from screens, with all the richness and stimulation that the real world can bring. Parents ultimately can never cease their vigilance. The internet cannot be made wholly safe, and we cannot be naive about that. Ultimately, parents have to remain the ultimate backstop to make sure that their children are safe online—but they need help. It is our role as legislators to provide some of those tools and that assistance to help children through their childhoods.

I hope that we look back at the time before the Online Safety Act and wonder how we ever allowed children to be exposed to the unbridled internet culture that has hitherto been the norm. To return to the opening remarks of the hon. Member for Sunderland Central, the internet has led to the creation of new bonds and a huge multiplication of opportunity, but simultaneously the opportunity for harm. Sometimes we need to make a choice, and while the Online Safety Act is undoubtedly imperfect, the imperative to protect children will always take precedence. If social media platforms are held to greater account, so be it.

18:23
Ian Murray Portrait The Minister for Digital Government and Data (Ian Murray)
- Hansard - - - Excerpts

It is great to see you in the Chair, Sir John. I did not realise you were such a technophobe until we heard from the shadow Minister, the hon. Member for Hornchurch and Upminster (Julia Lopez). I am disappointed that you were not able to contribute to this debate. I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for moving the motion on behalf of the Petitions Committee, and I thank him and other speakers for their contributions.

I have not been on the RTG fans message board that my hon. Friend mentioned, but I am sure it has been very busy this weekend. I wondered if some of the trolls mentioned by the hon. Member for Bromley and Biggin Hill (Peter Fortune) were perhaps wearing black and white over the weekend. My hon. Friend the Member for Sunderland Central raised an important point, however: it is the site managers and volunteers who are hosting those forums, keeping them legitimate and working very hard to abide by the law.

Jambos Kickback is an important site for my football team, and many people use it to find out what is going on. It is run by volunteers with no money at all—just for the sheer love of being on the forum together—so I fully understand what the petitioner wants to bring forward. I thank my hon. Friend for the measured way in which he put forward the e-petition. He called for robust, effective and proportionate regulation, which is what the Government are trying to do through the Online Safety Act.

The shadow Minister highlighted that by going through the ledger of the positive and negative issues that the Government face, and indeed that were faced when her party was in government. The one thing on that ledger that is non-negotiable is the safety of children online—I think all hon. Members made that point; in fact, I am disappointed that those who do not make that point are not in this debate to try to win that argument, because I would be very interested to hear what they have to say.

The petition received over 550,000 signatures. Although I appreciate the concerns that it raised, I must reiterate the Government’s very strong response that we have no plans to repeal the Online Safety Act. Parents should know and be confident that their children—I am a father of two young girls, aged five years and ten months—are safe when they access popular online services and that they can benefit from the opportunities that the online world offers. That is why the Government are working closely with Ofcom to implement the Act as quickly and as effectively as possible to enable UK users to benefit from the Act’s protections.

This year, 2025, has been one of significant action on online safety. On 17 March the illegal harms codes of practice came into effect. Those codes will drive significant improvements in online safety in several areas. Services are now required to put in place measures to reduce the risk of their services facilitating illegal content and activity, including terrorism, child sexual abuse and exploitation, and other kinds of illegal activity.

I asked the officials for a list of the priority offences in the Act; there were 17, but that number has increased to 20, with the new Secretary of State at the Department adding some others. It is worth reading through them because it shows the problem and the scale of it. I was really struck by Members who talked about the real world and the online world: if any of these offences were happening in the real world, someone would be carted off to jail immediately rather than being allowed to continue to operate, as they do online.

The priority offences are assisted suicide; threats to kill; public order offences such as harassment, stalking and fear of provocation of violence; drugs and psychoactive substances; firearms and other weapons; assisted illegal immigration; human trafficking; sexual exploitation; sexual images; intimate images of children; proceeds of crime; fraud; financial services fraud; foreign interference; animal welfare; terrorism; and controlling or coercive behaviour. The new ones that have been added by the Secretary of State include self-harm, cyber-flashing and strangulation porn. Do we honestly have to write that into a schedule of an Online Safety Act to say that those things are unacceptable and should not be happening on our computers?

On 25 July, the child safety regime came into force. Services now use highly effective age assurance to prevent children in the UK from encountering pornography and content that encourages, promotes and provides instructions for self-harm, suicide or eating disorders. Platforms are also now legally required to put in place measures to protect children from other types of harmful content, including abusive or hateful content, or bullying and violent content.

When we visited schools, we spoke to headteachers, teachers and parents about the real problem that schools have in trying to deal with the bullying effects of social media. According to Ofcom’s 4 December report that some hon. Members have referenced already, many services now deploy age checks, including the top 10 most popular pornographic sites, the UK’s most popular dating apps and a wide range of other services, including X, Telegram, Reddit, TikTok, Bluesky, Discord, Xbox and Steam. This represents a safer online experience for millions of children across the UK; we have heard that it is already having an impact.

The Government recognise, however, the importance of implementing the duties proportionately. That is why proportionality is a core principle of the Act and is built into many of the duties contained within it. Ofcom’s illegal content and child safety codes of practice set out recommended measures that are tailored to both size and risk to help providers to comply with their obligations —it is really important to emphasise that. When recommending steps that providers can take to comply with their duties, Ofcom must consider the size and risk level of different types and kinds of services.

Let me just concentrate on that for a minute. For instance, Ofcom recommends user blocking and muting measures to help to protect children from harmful content, including bullying, violent content and other harmful materials, and those recommendations are tailored to services’ size and risk profile. Specifically, Ofcom recommends that all services that are high risk for this content need to implement those measures in full. However, for services that are medium risk for this content, Ofcom suggests that they need to implement the measures only if they have more than 700,000 users.

However, while many services carry low risks of harm, risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government are very concerned about small platforms that host the most harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting all small services from duties requiring them to tackle that type of content would mean that those forums would not be subject to the Act’s enforcement powers, which is why we reject the petitioner’s views. Even forums that might seem harmless carry potential risks, such as where adults can engage directly with child users.

The Government recognise the importance of ensuring that low-risk services do not have unnecessary regulatory burdens placed upon them, which I hope reassures the shadow Minister. That is why, in the statement of strategic priorities issued on 2 July, the Government set out our expectation that Ofcom should continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. The Government also made it explicitly clear that Ofcom should ensure that expectations on low-risk services are proportionate.

Alongside proportionate implementation of the Act, the Government also understand the need to communicate the new regulations effectively, and to work with companies within its scope to ensure that compliance is as easy as possible. To deliver that, Ofcom is providing support to online service providers of all sizes to make it easier for them to understand and comply with their responsibilities under the UK’s new online safety laws. For example, Ofcom has already launched a regulation checker to help firms to check whether they are covered by the new rules, as well as a number of quick guides for them.

I will address some of the issues raised by Members. My right hon. Friend the Member for Oxford East (Anneliese Dodds) started by raising the issue of pornography and other harmful content. User-to-user services that allow pornographic content, and content that promotes, provides instructions for or encourages suicide, self-harm or eating disorders, must use highly effective age assurance to prevent all children under 18 from accessing that type of content.

Services must take proportionate steps to minimise the risk of children encountering that type of content when using them, and they must also put in place age assurance measures to protect children from harmful content, such as bullying and violent content. Ofcom’s “Protection of Children Codes of Practice” set out what steps services can take to comply, and Ofcom has robust enforcement powers available to use against companies that fail to fulfil those important duties. We are already seeing that enforcement happening, with 6,000 sites having taken action to stop children from seeing harmful content, primarily via age checks. That shows the scale of the issue.

Virtual private networks have also been mentioned by a number of Members, including the shadow Minister. Following the introduction of the child safety duties in July, Ofcom reported that UK daily active users of VPN apps temporarily doubled to around 1.5 million—the average is normally about 750,000. Since then, usage has dropped, falling back down to around 1 million daily users by the end of September. That was expected, and it has also happened in other jurisdictions that have introduced age checks. According to an Ofcom rule, services should

“take appropriate steps to mitigate against methods of circumvention that are easily accessible to children”.

If a provider is not complying with the age assurance duties, by promoting VPN usage to bypass age assurance methods, Ofcom can and should take enforcement action. The use of VPNs does not protect platforms from not complying with the Act itself.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

The Minister has done a huge amount of work on this issue, which I am sure is appreciated by everyone in this House. It cannot be beyond the wit of man to find a way for these VPN companies to bridge between the service user and the ultimate website or platform that they are viewing, so why are VPNs not in scope of the legislation to ensure that they are compliant with the age verification measures? Presumably, it is more difficult for the end website to know the origins of the user, if they have bypassed via a VPN. Surely the onus should be on the VPN company to comply with the law also.

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

My hon. Friend makes a good point; let me come back to him in detail on the VPN issue, as his question relates to what we are planning to do in our review of the Online Safety Act, including both what was written into the legislation and what was not.

My hon. Friend the Member for Darlington (Lola McEvoy), who is no longer in her place, highlighted the really important issue of chatbots, which has also been mentioned by a number of other Members. Generative AI services including chatbots that allow users to share content with one another or search live websites to provide search engines are already regulated under the Online Safety Act. Those services must protect users from illegal content and children from harmful and age-inappropriate content.

Victoria Collins Portrait Victoria Collins
- Hansard - - - Excerpts

Ofcom has said, and my understanding is, that in certain circumstances AI chatbots are covered, but certain new harms—such as emotional dependence—are not. That is an area where the House and many people are asking for clarity.

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

I do not disagree with the hon. Lady. There are a whole host of issues around porn bots and AI-generated bots that have now also sprung up. We know that we are committed to the Online Safety Act and its review as its being implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.

We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children and look at online harms and the movements on those.

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I wanted to raise with the Minister that the Science, Innovation and Technology Committee will be undertaking an inquiry in the new year on brain development, addictive use and how that impacts various key points in children’s development. The Minister says that he will look at all evidence. Will he look at the evidence produced by that inquiry to ensure that its information and advice goes to parents across this country?

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

I thank my hon. Friend for the work that she does on that Committee. Of course, the Government have to respond in detail to such reports and we look forward to the recommendations it brings forward. Often we see conspiracy theories in the online world, but there is no conspiracy theory here: the Government are not trying to defend a position against what evidence might come forward.

We have just signed a memorandum of understanding with Australia to look at their experiences of protecting children online and whether there are things that we can do in this country. It has to be evidence-based, and if the evidence base is there, we will certainly make sure to act, because it is non-negotiable that we protect young people and children online.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

I think there is no disagreement on the protection of children and there is no disagreement on what we have legislated to be illegal content. There is more debate needed on harmful but not illegal content and where that line is and what we enforce, and the protections for those who are not children, particularly vulnerable users and those who are being exploited and drawn into some quite extreme behaviours.

I will be honest about where some of these tensions are. How confident will the UK Government be in entering into negotiations on this when we are in the position we are in on trade with the US? The US has also made it clear that it sees any further regulation on social media platforms to be an infringement on trade and freedom of speech. When it comes to making that call, where will the UK Government be?

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

My hon. Friend makes an important point, because freedom of expression is guaranteed in the Act. Although we are regulating to make sure that children and young people are protected online, he is right to suggest that that does not mean we are censoring stuff for adult content. The internet is a place where people can access content if they are age-verified to do so, but it cannot be illegal content. The list of issues in schedule 7 to the Act that I read out at the start of my speech is pretty clear on what someone is not allowed to do online, so any illegal content online still remains illegal. We need to work clearly with the online platforms to make sure that that is not being purveyed through them.

We have seen strong examples of this issue in recent months. If we reflect back to Southport, the public turned to local newspapers—we have discussed this many times before—because they wanted fast and regular but trustworthy news. They turned away from social media channels to get the proper story, and they knew they could trust the local newspaper that they were able to pick up and read. I think the public have a very strong understanding of where we are, but I take the point about people who are not as tech-savvy or are impaired in some way, and so may need further protections. My hon. Friend makes the argument very strongly.

I want to turn to AI chatbots, because they were mentioned in terms of mental health. We are clear that AI must not replace trained professionals. The Government’s 10-year health plan lays foundations for a digital front door for mental health care. Last month, the Secretary of State for Science, Innovation and Technology urged Ofcom to use existing powers to protect children from the potential harms of AI chatbots. She is clear that she is considering what more needs to be done. The Department of Health and Social Care is looking at mental health through the 10-year plan, but the Secretary of State for Science, Innovation and Technology has also been clear that she will not allow AI chatbots to affect young people’s mental health, and will address their development, as mentioned by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins).

Let me touch on freedom of expression, because it is important to balance that out. It is on the other side of the shadow Minister’s ledger, and rightly so, because safeguards to protect freedom of expression and privacy are built in throughout the Online Safety Act. Services must consider how to protect users’ rights when applying safety measures, including users’ rights to express themselves freely. Providers do not need to take action on content that is beneficial to children—only against content that poses a risk of harm to children on their services. The Act does not prevent adults from seeking out legal content, and does not restrict people posting legal content that others of opposing views may find offensive. There is no removing of freedom of speech. It is a cornerstone of this Government, and under the Act, platforms have duties to protect freedom of speech. It is written into legislation.

Let me reiterate: the Online Safety Act does not limit freedom of speech. In fact, it protects it. My hon. Friend the Member for Worcester (Tom Collins) was clear when he said in his wonderful speech that making the internet a safe space promotes freedom of speech. Indeed it does, because it allows us to have the confidence that we can use online social media platforms, trust what we are reading and seeing, and know that our children are exposed to age-appropriate content.

I will address age assurance, which was mentioned by the hon. Member for Dewsbury and Batley (Iqbal Mohamed). Ofcom is required to produce a report on the use of age assurance technologies, including the effectiveness of age assurance, due in July 2026—so in seven months’ time. That allows sufficient time for these measures to embed in before considering further action, but the Government continue to monitor the impact of circumvention techniques such as VPNs and the effectiveness of the Act in protecting children. We will not hesitate to go further if necessary, but we are due that report in July 2026, which will be 12 months from the implementation of the measures.

The Liberal Democrat spokesperson asked about reviewing the Act. My previous comments covered some of that, but it is critical that we understand how effective the online safety regime is, and monitoring and evaluating that is key. My Department, Ofcom and the Home Office have developed a framework to monitor the implementation of the Act and evaluate the core outcomes from it.

Tom Collins Portrait Tom Collins
- Hansard - - - Excerpts

The Minister describes the review of the Act and how we have a rapidly growing list of potential harms. It strikes me that we are up against a very agile and rapidly developing world. I recently visited the BBC Blue Room and saw the leading edge of consumer-available technology, and it was quite disturbing to see the capabilities that are coming online soon. In the review of the Act, is there scope to move from a register of harms into perhaps domains of safety, such as trauma, addiction or attachment, where the obligation would be on service providers or manufacturers to ensure their products were safe across those domains? Once again, there could be security for smaller businesses available from the world of technical standards, where if a business is offering a simple service and meets an industry-developed standard, they have presumption of compliance. The British Standards Institution has demonstrated very rapid development of that through the publicly available specification system, and that is available to help us to navigate this rapidly. Could that be in scope?

John Hayes Portrait Sir John Hayes (in the Chair)
- Hansard - - - Excerpts

Interventions should be brief, but I am very kind.

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

Sir John, you are indeed very kind. My hon. Friend gave two examples during his speech. First, he mentioned brakes that were available only for high-end and expensive cars, and are now on all cars. Secondly, he mentioned building regulations, and how we would not build a balcony without a barrier. Those examples seem fairly obvious and almost flippant, but it seems strange that we would regulate heavily to make sure that people are safe physically—nobody would ever argue that it would be a complete disregard of people’s freedom to have a barrier on an 18th-floor balcony—but not online. We do that to keep people safe, and particularly to keep children safe. As my hon. Friend said, if we are keeping adults safe, we are ultimately keeping children safe too.

We have to continue to monitor and evaluate. I was just about to come on to the post-implementation review of the Act, which I am sure my hon. Friend will be very keen to have an input into. The Secretary of State must complete a review of the online safety regime two to five years after part 3 of the Act, which is about duties of care, fully comes into force. The review will therefore be completed no sooner than 2029. These are long timescales, of course, and technology is moving, so I understand the point that he is making. I recall that in the Parliament from 2010 to 2015, we regulated for the telephone, so we move slowly, although we understand that we also have to be nimble to legislate.

The Lib Dem spokesperson, the hon. Member for Harpenden and Berkhamsted, asked whether the Act has gone far enough. Ofcom, the regulator, is taking an iterative approach and will strengthen codes of practice as online harms, technology and the evidence evolve. We are already making improvements, for example strengthening the law to tackle self-harm, cyber-flashing and strangulation. The hon. Lady also asked whether Ofcom has received an increase in resources. It has—Ofcom spending has increased by nearly 30% in the past year, in recognition of its increased responsibilities. She also asked about a digital age of consent. As I mentioned, we have signed a memorandum of understanding with Australia and will engage with Australia to understand its approach. Any action will be based, of course, on robust evidence.

Victoria Collins Portrait Victoria Collins
- Hansard - - - Excerpts

I would just like to clarify that I made a call for an age of data consent. We put that forward earlier this year as an amendment to the Act. A very first step is to stop social media companies harvesting data and using it to power these addictive algorithms against young people. It is about data consent to 16. Then of course, there is the wider discussion about what is happening with social media in general, but it is that age of data consent that is our first call to action.

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

I take that point about the amendment that the Liberal Democrats tabled.

The hon. Lady also asked for a cross-party Committee to take action. I have already talked about the review of the implementation of the regulations that will happen in July and the other stages after that, as well as the post-implementation review. Of course, setting up a new Committee is a matter for the House. I have no objections to the House setting up Committees to look at these big and important issues that we all care about, if that is what it decides to do.

My hon. Friend the Member for Worcester talked about the issue of Parliament and engagement. He asked whether the Department would engage with the group of academics he mentioned, who are looking at technical safety standards for social media, including considering what role those academics could play in relation to these provisions. I welcome his invitation and I am sure that the Minister responsible for this area—the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Vale of Glamorgan (Kanishka Narayan)—would be delighted to participate in those talks. I am sure that he will be in touch with my hon. Friend the Member for Worcester to take him up on that offer.

We have heard about algorithms, so it is worth focusing concentrating on them. Hon. Friends have talked about the algorithms that serve harmful content. The Government have been clear that algorithms can impact on the risk of harm for children, which is why the legislation comprehensively covers them. The legislation requires providers to consider, via risk assessment, how algorithms could impact children’s exposure to illegal or harmful content, and providers must then take steps to mitigate those risks. If they do not do so, Ofcom has powers that it can use.

Jim McMahon Portrait Jim McMahon
- Hansard - - - Excerpts

There needs to be a tie-in here with the Cabinet Office and the review of electoral law. If a kind donor in my constituency owned a big billboard and gave me absolute free use of it during an election period, but made an offer to any other party that they could put a Post-it note on the back of it that nobody would see, I would have been expected to declare that as a gift in kind, or a donation in kind. That is not the case with algorithms that are posting and promoting generally right-wing and far-right content during the regulated period. Surely there has to be a better join-up here of election law and online law.

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

This is a huge issue and all of us in this House are very concerned about misinformation and disinformation, and the impact on our democracy. Indeed, I am sure that in the time that I have been speaking here in Westminster Hall, my own social media will have been full of bots and all sorts of other things that try to encourage people to get involved in this debate, in order to influence the algorithm. That can fundamentally disturb our democracy, and is something we are looking at very closely. The Cabinet Office and ourselves are looking at the misinformation and disinformation issue, as is the Department for Culture, Media and Sport in terms of the media outlook and how elections are run in this country. We should all be very clear about not having our democratic processes undermined by such algorithmic platforms that serve up the kind of content that provides misinformation and disinformation to the public.

Emily Darlington Portrait Emily Darlington
- Hansard - - - Excerpts

I appreciate what the Minister says—that these powers are in legislation—yet the process is still the social media platforms marking their own homework. We are in a vicious circle: Ofcom will not take action unless it has a complaint based on evidence, but the evidence is not achievable because the algorithm is not made available for scrutiny. How should Ofcom use those powers more clearly ahead of the elections to ensure that such abuse to our democracy does not occur?

Ian Murray Portrait Ian Murray
- Hansard - - - Excerpts

A whole host of legislation sits behind this, including through the Electoral Commission and the Online Safety Act, but it is important for us to find ways to ensure that we protect our democratic processes, whether that be from algorithmic serving of content or foreign state actors. It is in the public domain that, when the Iranian servers went dark during the conflict with the US, a third of pro-independence Facebook pages in Scotland went dark, because they were being served by foreign state actors. We have seen that from Russia and various other foreign actors. We have to be clear that the regulations in place need to be implemented and, if they are not, we need to find other ways to ensure that we protect our democracy. At a small tangent, our public sector broadcasters and media companies are a key part of that.

To stay with my hon. Friend the Member for Milton Keynes Central (Emily Darlington), she made an excellent contribution, with figures for what is happening. She asked about end-to-end encryption. We support responsible use of encryption, which is a vital part of our digital world, but the Online Safety Act does not ban any service design such as end-to-end encryption, nor does it require the creation of back doors. However, the implementation of end-to-end encryption in a way that intentionally binds tech companies to content will have a disastrous impact on public safety, in particular for children, and we expect services to think carefully about their design choices and to make the services safe by design for children.

That leads me to online gaming platforms and Roblox, which my hon. Friend also mentioned. Ofcom has asked the main platforms, including Roblox, to share what they are doing and to make improvements where needed. Ofcom will take action if that is not advanced. A whole host of things are happening, and we need the Online Safety Act and the regulations underpinning it to take time to feed through. I hope that we will start to see significant improvements, as reflected on by my hon. Friend the Member for Sunderland Central.

My hon. Friend the Member for Milton Keynes Central mentioned deepfakes. That issue is important to our democracy as well. The Government are concerned about the proliferation of AI-enabled products and services that enable deepfake non-consensual images. In addition to criminalising the creation of non-consensual images, the Government are looking at further options, and we hope to provide an update on that shortly. It is key to protecting not only our wider public online but, fundamentally, those who seek public office.

The Government agree that a safer digital future needs to include small, personally owned and maintained websites. We recognise the importance that proportionate implementation of the Online Safety Act plays in supporting that aim. We can all agree that we need to protect children online, and we would not want low-risk services to have any unnecessary compliance burden. That is a balance that we have to strike to make it proportionate. The Government will conduct a post-implementation review of the Act and will consider the burdens on low-risk services as part of that review, as mentioned in the petition. We will also ensure that the Online Safety Act protects children and is nimble enough to deal with a very fast-moving tech world. I thank all hon. Members for providing a constructive debate and raising their issues. I look forward to engaging further in the months and years ahead.

18:54
Lewis Atkinson Portrait Lewis Atkinson
- Hansard - - - Excerpts

It is a pleasure to see you in the Chair for the conclusion of this debate, Sir John. I thank all Members for their contributions. I think that we had a really constructive and thorough debate, and I certainly learned a lot in the course of it. I only wish that I had heard some of the contributions before I wrote my opening speech. I particularly thank the Minister for being so generous with his time in giving a thorough response and taking interventions, which I think gave us significant insight.

The contributions from the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins), were thoughtfully made, and the shadow Minister, the hon. Member for Hornchurch and Upminster (Julia Lopez), clearly brought to the debate her expertise on this subject. I think that the points about the wider impacts on mental health and wellbeing of the new online world will be particularly relevant as the Government think more widely about the approach to mental health in their strategy, which I hope will be forthcoming. I can say, as the Minister did, that my only regret is that the few Members of the House who have publicly called for outright repeal of the Online Safety Act were not here this afternoon to give their perspective and to engage with the thoroughness that I think all Members present have engaged.

I have a final couple of reflections. I think that everyone I have heard in this space and debate has been motivated by a desire to preserve rights and our British values, whether it be rights to freedom of speech, freedom of expression or freedom of association, including through small online spaces and forums, but as my hon. Friend the Member for Worcester (Tom Collins) rightly said in his excellent contribution, safe spaces online open up space for further freedoms, and the freedoms of children cannot be infringed by freedoms for adults. It was really shocking to hear the extent of the harms that children are suffering in the current environment. I think that the motivations behind this petition were not about that at all, but it was very much about making sure that the freedoms of association that we hold dear in this country are able to be continued online through small forums. I welcome the Minister’s assurance that the Government see ample space for small and independent providers in the future as part of that.

A reflection that I had in the course of the debate is that increasingly we are talking about safety by design, but a lot of online forums came about in a world in which there was no safety by design. Part of the implementation issue is that the tech being used for online forums is probably 10 or 15 years old in relation to message boards. If a new message board were being set up today, with the use of new tech standards, I would hope that safety by design would be much more embedded, and the responsibilities that fall to individual volunteers and administrators would be lessened as a result.

It is entirely natural that the first attempt at regulation and legislation will not get everything right and that it will require evolution. The Online Safety Act was a landmark attempt to regulate online harms, but I think it is fair to say that the consensus that we have heard today is that it needs to evolve—that we should be looking not to repeal the Act but to evolve at pace and ensure implementation at pace, so that we tackle online harms in a way that is consistent with our British values and the freedoms of expression and association that we have heard about.

It only remains for me to thank again Mr Baynham, the creator of the petition, and all the petitioners for their online engagement—I have to say—in the petition process, without which today’s really informative debate would not have occurred.

Question put and agreed to.

Resolved,

That this House has considered e-petition 722903 relating to the Online Safety Act.

18:58
Sitting adjourned.