Draft Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2025 Debate

Department: Department for Science, Innovation & Technology


Tuesday 18th November 2025


General Committees
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to serve under your chairmanship, Mr Vickers.

This statutory instrument represents an important development in the obligations on platforms regulated under the Online Safety Act to protect people from encountering illegal content online. The OSA was enacted by the last Government with the primary aim of safeguarding children and removing serious illegal material from the internet. Tackling the most harmful content, such as that which is the subject of today’s discussion, goes to the heart of the Online Safety Act’s aims. His Majesty’s Opposition therefore welcome and support the draft regulations.

The experiences and opportunities offered by the online world change rapidly. It is right that legislators are responsive when new risks emerge or when certain types of unlawful content proliferate on the internet. Under the last Government, the OSA amended the Sexual Offences Act 2003 to criminalise several forms of sexual misconduct and abusive behaviour online. The new offences included cyber-flashing and the sharing of or threatening to share intimate images without consent. The amendments were made to keep pace with novel threats and forms of abuse, the victims of which are too often women and girls.

Baroness Bertin’s independent review of pornography, which was published in February this year, highlighted the damaging impact on victims of intimate image abuse, ranging from physical illness to mental health effects such as anxiety, depression, post-traumatic stress disorder and suicidal thoughts. The effects of cyber-flashing and intimate image abuse on victims are severe. It is therefore right that this statutory instrument brings cyber-flashing within the scope of the priority offences in schedule 7 to the Online Safety Act, while retaining as a priority offence the sharing of or threatening to share intimate images.

We also strongly support the addition as a priority offence of encouraging or assisting serious self-harm, which is the other important component of this statutory instrument. Desperate people who contemplate self-harm need early intervention and support, not encouragement to harm themselves. Under this SI, regulated services will be obliged to proactively remove the material when they become aware of it on their platforms and to take measures to prevent it from appearing in the first place. One can only wonder why it has taken so long to get to this position. I am sure there will be a unanimous view, not only in the House but in society, on the importance of removing such material.

The regulations will work only if they are adopted by the industry and subject to rigorous oversight, coupled with enforcement when platforms fail in their obligations. That is a necessity, and why we had to introduce the Online Safety Act in the first place. It is right that Government regulators should look to identify obstacles to the implementation of the OSA and take action where necessary. Since the introduction of Ofcom’s protection of children codes in the summer, important questions have arisen around the use of virtual private networks to circumvent age verification, as well as data security and privacy in the age-verification process.

Peter Fortune (Bromley and Biggin Hill) (Con)

On that point, does my hon. Friend the shadow Minister agree that we need to give some thought to the rise of chatbots and their nefarious activity, especially where they encourage self-harm or encourage children to do worse?

Dr Spencer

I thank my hon. Friend for his question on a very important point, which was raised just last week in Department for Science, Innovation and Technology questions by my hon. Friend the Member for Harrow East (Bob Blackman) and others. The Lib Dem spokesperson, the hon. Member for Harpenden and Berkhamsted, also raised questions about the importance of the scope of regulations for chatbots.

The Government seem all over the place as to whether large language models, as we understand them, are regulated in respect of the content that comes into scope. Given the response we received last week, it would be helpful to have some clarity from the Minister. Does he believe that LLMs are covered by the OSA when it comes to material that encourages self-harm? If there is a gap, what is he going to do about it? I recognise that he is commissioning Ofcom to look at the issue, but in his view, right now, is there a gap that needs fixing? What are his reflections on that? This is increasingly becoming a priority area that we need to resolve. If there is a gap in the legislation, we need to get on and sort it.

--- Later in debate ---
Kanishka Narayan

I thank Committee members for their valuable contributions to the debate. The update in the regulations will bring us closer to achieving the Government’s commitments to improve online safety and strengthen protection for women and girls online. We believe that updating the priority offences list with the new cyber-flashing and self-harm content offences is the correct, proportionate and evidence-led approach to tackling this type of content, and it will provide stronger protections for online users.

I will now respond to the questions asked in the debate; I thank Members for the tone and substance of their contributions. The shadow Minister, the hon. Member for Runnymede and Weybridge, raised the use of VPNs. As I mentioned previously in the House, apart from an initial spike we have seen a significant levelling-off in the usage of VPNs, which points to the likely effectiveness of the age-assurance measures. We have commissioned further evidence on that front, and I hope to bring that to the House’s attention at the earliest opportunity.

The question of chatbots was raised by the shadow Minister, by the hon. Member for Bromley and Biggin Hill, and by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted. Let me first clarify what I previously mentioned in the House: the legislation covers not only chatbots that allow user-to-user engagement but those that involve one-to-AI engagement and live search. That is extensive coverage of chatbots—both those types are within scope of the Online Safety Act.

There may be further gaps in the Act that pertain to aspects of the risks that Members have raised, and the Secretary of State has commissioned further work to ensure that we keep up with fast-changing technology. A number of the LLMs in question are covered by the Act, given the parameters that I have just defined. Of course, we will continue to review the situation, as both scope and risk need to evolve together.

Dr Spencer

I hope the Minister takes this in a constructive spirit. Concerns have been raised across the House as to the scope of the OSA when it comes to LLMs and the different types and variations of chatbots, which are being used by many people right now. Is he not concerned that he, as the Minister, and his Department are not able to say at the Dispatch Box whether they believe LLMs are completely covered by the scope of the OSA? Has he received legal advice or other advice? How quickly will he be able to give a definitive response? Clearly, if there is a gap, we need to know about it and we need to take action. It surely puts the regulator and the people who are generating this technology in an invidious position if even His Majesty’s Government think there is a lack of clarity, as he put it, on the applicability of the OSA to new technologies.

Kanishka Narayan

Let me be clear: there is no lack of clarity in the scope of the Act. It is extremely clear to a provider whether they are in scope or not. If they have user-to-user engagement on the platform, they are in scope. If they have live search, which is the primary basis in respect of many LLMs at the moment, they are in scope. There is no lack of clarity from a provider point of view. The question at stake is whether the further aspects of LLMs, which do not involve any of those areas of scope, pose a particular risk.

A number of incidents have been reported publicly, and I will obviously not comment on individual instances. The Online Safety Act does not focus on individual content-takedown instances and instead looks at a system. Ofcom has engaged firms that are very much in scope of the Act already. If there are further instances of new risks posed by platforms that are not currently within the scope of the Online Safety Act, we will of course review its scope and make sure we are moving fast in the light of that information.

The hon. Member for Harpenden and Berkhamsted asked about child sexual abuse material. I was very proud that we introduced amendments last week to the Crime and Policing Bill to make sure that organisations such as the Internet Watch Foundation are engaged, alongside targeted experts, particularly the police, in spotting CSAM content and risk way before AI models are released. In that context, we are ensuring that the particular risks posed by AI to children’s safety are countered before they escalate.

On the question about Ofcom’s spending and capacity more generally to counter the nature of the risk, the spending cap at Ofcom allows it to enforce against the offences that we deem to be priority offences. In part, when we make the judgment about designating offences as a priority, we make a proportionate assessment about whether we believe there is both sufficient severity and the capacity for robust enforcement. I will continue to review that situation as the nature of the offences changes.

Finally, I am glad that the Government have committed throughout to ensure that sexually explicit non-consensual images, particularly deepfakes, are robustly enforced against. That remains the position. I hope the Committee agrees with me on the importance of updating the priority offences in the Online Safety Act as swiftly as possible. I commend the regulations to the Committee.

Question put and agreed to.