Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
It is great to see you in the Chair, Sir John. I did not realise you were such a technophobe until we heard from the shadow Minister, the hon. Member for Hornchurch and Upminster (Julia Lopez). I am disappointed that you were not able to contribute to this debate. I thank my hon. Friend the Member for Sunderland Central (Lewis Atkinson) for moving the motion on behalf of the Petitions Committee, and I thank him and other speakers for their contributions.
I have not been on the RTG fans message board that my hon. Friend mentioned, but I am sure it has been very busy this weekend. I wondered if some of the trolls mentioned by the hon. Member for Bromley and Biggin Hill (Peter Fortune) were perhaps wearing black and white over the weekend. My hon. Friend the Member for Sunderland Central raised an important point, however: it is the site managers and volunteers who are hosting those forums, keeping them legitimate and working very hard to abide by the law.
Jambos Kickback is an important site for my football team, and many people use it to find out what is going on. It is run by volunteers with no money at all—just for the sheer love of being on the forum together—so I fully understand what the petitioner wants to bring forward. I thank my hon. Friend for the measured way in which he put forward the e-petition. He called for robust, effective and proportionate regulation, which is what the Government are trying to do through the Online Safety Act.
The shadow Minister highlighted that by going through the ledger of the positive and negative issues that the Government face, and indeed that were faced when her party was in government. The one thing on that ledger that is non-negotiable is the safety of children online—I think all hon. Members made that point; in fact, I am disappointed that those who do not make that point are not in this debate to try to win that argument, because I would be very interested to hear what they have to say.
The petition received over 550,000 signatures. Although I appreciate the concerns that it raised, I must reiterate the Government’s very strong response that we have no plans to repeal the Online Safety Act. Parents should know and be confident that their children—I am a father of two young girls, aged five years and ten months—are safe when they access popular online services and that they can benefit from the opportunities that the online world offers. That is why the Government are working closely with Ofcom to implement the Act as quickly and as effectively as possible to enable UK users to benefit from the Act’s protections.
This year, 2025, has been one of significant action on online safety. On 17 March the illegal harms codes of practice came into effect. Those codes will drive significant improvements in online safety in several areas. Services are now required to put in place measures to reduce the risk of their services facilitating illegal content and activity, including terrorism, child sexual abuse and exploitation, and other kinds of illegal activity.
I asked the officials for a list of the priority offences in the Act; there were 17, but that number has increased to 20, with the new Secretary of State at the Department adding some others. It is worth reading through them because it shows the problem and the scale of it. I was really struck by Members who talked about the real world and the online world: if any of these offences were happening in the real world, someone would be carted off to jail immediately rather than being allowed to continue to operate, as they do online.
The priority offences are assisted suicide; threats to kill; public order offences such as harassment, stalking and fear of provocation of violence; drugs and psychoactive substances; firearms and other weapons; assisted illegal immigration; human trafficking; sexual exploitation; sexual images; intimate images of children; proceeds of crime; fraud; financial services fraud; foreign interference; animal welfare; terrorism; and controlling or coercive behaviour. The new ones that have been added by the Secretary of State include self-harm, cyber-flashing and strangulation porn. Do we honestly have to write that into a schedule of an Online Safety Act to say that those things are unacceptable and should not be happening on our computers?
On 25 July, the child safety regime came into force. Services now use highly effective age assurance to prevent children in the UK from encountering pornography and content that encourages, promotes and provides instructions for self-harm, suicide or eating disorders. Platforms are also now legally required to put in place measures to protect children from other types of harmful content, including abusive or hateful content, or bullying and violent content.
When we visited schools, we spoke to headteachers, teachers and parents about the real problem that schools have in trying to deal with the bullying effects of social media. According to Ofcom’s 4 December report that some hon. Members have referenced already, many services now deploy age checks, including the top 10 most popular pornographic sites, the UK’s most popular dating apps and a wide range of other services, including X, Telegram, Reddit, TikTok, Bluesky, Discord, Xbox and Steam. This represents a safer online experience for millions of children across the UK; we have heard that it is already having an impact.
The Government recognise, however, the importance of implementing the duties proportionately. That is why proportionality is a core principle of the Act and is built into many of the duties contained within it. Ofcom’s illegal content and child safety codes of practice set out recommended measures that are tailored to both size and risk to help providers to comply with their obligations—it is really important to emphasise that. When recommending steps that providers can take to comply with their duties, Ofcom must consider the size and risk level of different types and kinds of services.
Let me just concentrate on that for a minute. For instance, Ofcom recommends user blocking and muting measures to help to protect children from harmful content, including bullying, violent content and other harmful materials, and those recommendations are tailored to services’ size and risk profile. Specifically, Ofcom recommends that all services that are high risk for this content need to implement those measures in full. However, for services that are medium risk for this content, Ofcom suggests that they need to implement the measures only if they have more than 700,000 users.
However, while many services carry low risks of harm, risk assessment duties are key to ensuring that risky services of all sizes do not slip through the net of regulation. For example, the Government are very concerned about small platforms that host the most harmful content, such as forums dedicated to encouraging suicide or self-harm. Exempting all small services from duties requiring them to tackle that type of content would mean that those forums would not be subject to the Act’s enforcement powers, which is why we reject the petitioner’s views. Even forums that might seem harmless carry potential risks, such as where adults can engage directly with child users.
The Government recognise the importance of ensuring that low-risk services do not have unnecessary regulatory burdens placed upon them, which I hope reassures the shadow Minister. That is why, in the statement of strategic priorities issued on 2 July, the Government set out our expectation that Ofcom should continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. The Government also made it explicitly clear that Ofcom should ensure that expectations on low-risk services are proportionate.
Alongside proportionate implementation of the Act, the Government also understand the need to communicate the new regulations effectively, and to work with companies within the Act’s scope to ensure that compliance is as easy as possible. To deliver that, Ofcom is providing support to online service providers of all sizes to make it easier for them to understand and comply with their responsibilities under the UK’s new online safety laws. For example, Ofcom has already launched a regulation checker to help firms to check whether they are covered by the new rules, as well as a number of quick guides for them.
I will address some of the issues raised by Members. My right hon. Friend the Member for Oxford East (Anneliese Dodds) started by raising the issue of pornography and other harmful content. User-to-user services that allow pornographic content, and content that promotes, provides instructions for or encourages suicide, self-harm or eating disorders, must use highly effective age assurance to prevent all children under 18 from accessing that type of content.
Services must take proportionate steps to minimise the risk of children encountering that type of content when using them, and they must also put in place age assurance measures to protect children from harmful content, such as bullying and violent content. Ofcom’s “Protection of Children Codes of Practice” set out what steps services can take to comply, and Ofcom has robust enforcement powers available to use against companies that fail to fulfil those important duties. We are already seeing that enforcement happening, with 6,000 sites having taken action to stop children from seeing harmful content, primarily via age checks. That shows the scale of the issue.
Virtual private networks have also been mentioned by a number of Members, including the shadow Minister. Following the introduction of the child safety duties in July, Ofcom reported that UK daily active users of VPN apps temporarily doubled to around 1.5 million—the average is normally about 750,000. Since then, usage has dropped, falling back down to around 1 million daily users by the end of September. That was expected, and it has also happened in other jurisdictions that have introduced age checks. According to an Ofcom rule, services should
“take appropriate steps to mitigate against methods of circumvention that are easily accessible to children”.
If a provider is not complying with the age assurance duties, for example by promoting VPN usage to bypass age assurance methods, Ofcom can and should take enforcement action. The use of VPNs does not exempt platforms from complying with the Act itself.
The Minister has done a huge amount of work on this issue, which I am sure is appreciated by everyone in this House. It cannot be beyond the wit of man to find a way for these VPN companies to bridge between the service user and the ultimate website or platform that they are viewing, so why are VPNs not in scope of the legislation to ensure that they are compliant with the age verification measures? Presumably, it is more difficult for the end website to know the origins of the user, if they have bypassed via a VPN. Surely the onus should be on the VPN company to comply with the law also.
My hon. Friend makes a good point; let me come back to him in detail on the VPN issue, as his question relates to what we are planning to do in our review of the Online Safety Act, including both what was written into the legislation and what was not.
My hon. Friend the Member for Darlington (Lola McEvoy), who is no longer in her place, highlighted the really important issue of chatbots, which has also been mentioned by a number of other Members. Generative AI services, including chatbots that allow users to share content with one another or that search live websites to provide results, are already regulated under the Online Safety Act. Those services must protect users from illegal content and children from harmful and age-inappropriate content.
I thank my hon. Friend for the work that she does on that Committee. Of course, the Government have to respond in detail to such reports and we look forward to the recommendations it brings forward. Often we see conspiracy theories in the online world, but there is no conspiracy theory here: the Government are not trying to defend a position against what evidence might come forward.
We have just signed a memorandum of understanding with Australia to look at its experience of protecting children online and whether there are things that we can do in this country. Our approach has to be evidence-based, and if the evidence base is there, we will certainly make sure to act, because it is non-negotiable that we protect young people and children online.
I think there is no disagreement on the protection of children, and there is no disagreement on what we have legislated to be illegal content. More debate is needed on content that is harmful but not illegal: where that line sits and what we enforce, and on the protections for those who are not children, particularly vulnerable users and those who are being exploited and drawn into some quite extreme behaviours.
I will be honest about where some of these tensions are. How confident will the UK Government be in entering into negotiations on this when we are in the position we are in on trade with the US? The US has also made it clear that it sees any further regulation of social media platforms as an infringement on trade and freedom of speech. When it comes to making that call, where will the UK Government be?
My hon. Friend makes an important point, because freedom of expression is guaranteed in the Act. Although we are regulating to make sure that children and young people are protected online, he is right to suggest that that does not mean we are censoring content for adults. The internet is a place where people can access content if they are age-verified to do so, but it cannot be illegal content. The list of issues in schedule 7 to the Act that I read out at the start of my speech is pretty clear on what someone is not allowed to do online, so any illegal content online still remains illegal. We need to work closely with the online platforms to make sure that such content is not being purveyed through them.
We have seen strong examples of this issue in recent months. If we reflect back to Southport, the public turned to local newspapers—we have discussed this many times before—because they wanted fast and regular but trustworthy news. They turned away from social media channels to get the proper story, and they knew they could trust the local newspaper that they were able to pick up and read. I think the public have a very strong understanding of where we are, but I take the point about people who are not as tech-savvy or are impaired in some way, and so may need further protections. My hon. Friend makes the argument very strongly.
I want to turn to AI chatbots, because they were mentioned in terms of mental health. We are clear that AI must not replace trained professionals. The Government’s 10-year health plan lays foundations for a digital front door for mental health care. Last month, the Secretary of State for Science, Innovation and Technology urged Ofcom to use existing powers to protect children from the potential harms of AI chatbots. She is clear that she is considering what more needs to be done. The Department of Health and Social Care is looking at mental health through the 10-year plan, but the Secretary of State for Science, Innovation and Technology has also been clear that she will not allow AI chatbots to affect young people’s mental health, and will address their development, as mentioned by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted (Victoria Collins).
Let me touch on freedom of expression, because it is important to balance that out. It is on the other side of the shadow Minister’s ledger, and rightly so, because safeguards to protect freedom of expression and privacy are built in throughout the Online Safety Act. Services must consider how to protect users’ rights when applying safety measures, including users’ rights to express themselves freely. Providers do not need to take action on content that is beneficial to children—only against content that poses a risk of harm to children on their services. The Act does not prevent adults from seeking out legal content, and does not restrict people posting legal content that others of opposing views may find offensive. Freedom of speech is not being removed; it is a cornerstone for this Government, and under the Act platforms have duties to protect it. That is written into legislation.
Let me reiterate: the Online Safety Act does not limit freedom of speech. In fact, it protects it. My hon. Friend the Member for Worcester (Tom Collins) was clear when he said in his wonderful speech that making the internet a safe space promotes freedom of speech. Indeed it does, because it allows us to have the confidence that we can use online social media platforms, trust what we are reading and seeing, and know that our children are exposed to age-appropriate content.
I will address age assurance, which was mentioned by the hon. Member for Dewsbury and Batley (Iqbal Mohamed). Ofcom is required to produce a report on the use of age assurance technologies, including the effectiveness of age assurance, due in July 2026—so in seven months’ time. That allows sufficient time for these measures to bed in before we consider further action, but the Government continue to monitor the impact of circumvention techniques such as VPNs and the effectiveness of the Act in protecting children. We will not hesitate to go further if necessary, but we are due that report in July 2026, which will be 12 months from the implementation of the measures.
The Liberal Democrat spokesperson asked about reviewing the Act. My previous comments covered some of that, but it is critical that we understand how effective the online safety regime is, and monitoring and evaluating that is key. My Department, Ofcom and the Home Office have developed a framework to monitor the implementation of the Act and evaluate the core outcomes from it.
I take that point about the amendment that the Liberal Democrats tabled.
The hon. Lady also asked for a cross-party Committee to take action. I have already talked about the review of the implementation of the regulations that will happen in July and the other stages after that, as well as the post-implementation review. Of course, setting up a new Committee is a matter for the House. I have no objections to the House setting up Committees to look at these big and important issues that we all care about, if that is what it decides to do.
My hon. Friend the Member for Worcester talked about the issue of Parliament and engagement. He asked whether the Department would engage with the group of academics he mentioned, who are looking at technical safety standards for social media, including considering what role those academics could play in relation to these provisions. I welcome his invitation and I am sure that the Minister responsible for this area—the Under-Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Vale of Glamorgan (Kanishka Narayan)—would be delighted to participate in those talks. I am sure that he will be in touch with my hon. Friend the Member for Worcester to take him up on that offer.
We have heard about algorithms, so it is worth concentrating on them. Hon. Friends have talked about the algorithms that serve harmful content. The Government have been clear that algorithms can affect the risk of harm to children, which is why the legislation comprehensively covers them. The legislation requires providers to consider, via risk assessment, how algorithms could affect children’s exposure to illegal or harmful content, and providers must then take steps to mitigate those risks. If they do not do so, Ofcom has powers that it can use.
There needs to be a tie-in here with the Cabinet Office and the review of electoral law. If a kind donor in my constituency owned a big billboard and gave me absolute free use of it during an election period, but made an offer to any other party that they could put a Post-it note on the back of it that nobody would see, I would have been expected to declare that as a gift in kind, or a donation in kind. That is not the case with algorithms that are posting and promoting generally right-wing and far-right content during the regulated period. Surely there has to be a better join-up here of election law and online law.
This is a huge issue, and all of us in this House are very concerned about misinformation and disinformation, and their impact on our democracy. Indeed, I am sure that in the time that I have been speaking here in Westminster Hall, my own social media will have been full of bots and all sorts of other things that try to encourage people to get involved in this debate, in order to influence the algorithm. That can fundamentally disturb our democracy, and is something we are looking at very closely. The Cabinet Office and my Department are looking at the misinformation and disinformation issue, as is the Department for Culture, Media and Sport in terms of the media and how elections are run in this country. We should all be very clear about not having our democratic processes undermined by algorithmic platforms that serve up misinformation and disinformation to the public.
Written Corrections
I get the Government’s intention, which I strongly support, and I credit the Minister for the work that he is doing, but none of us would accept a member of the public going into a newsagent’s, taking a newspaper off the rack and walking out without paying for it, yet that is exactly what is taking place with these online giants. They are taking the news off the rack without any payment, commercialising it and making billions in the process. That is what we need to consider. I hear the arguments about whether local authorities should continue with statutory notices—I have a different view; I am not sure that we should hold on to something from the past if it is not adding real value that can be demonstrated from the public investment—but we need to move to a modern way of funding a sustainable local press. Surely that requires a bigger intervention from the Government.
I will come on to that, because the AI copyright issue is a key part of what we are trying to determine. As my hon. Friend will know, under the legislation, the Government are preparing to publish the report and impact assessment required by sections 135 and 136 of the Data (Use and Access) Act 2025. That must be laid before the House by 12 December… That assessment will be reported to the House by Christmas.
[Official Report, 3 December 2025; Vol. 776, c. 388WH.]
Written correction submitted by the Minister for Creative Industries, Media and Arts, the right hon. Member for Edinburgh South (Ian Murray):