Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 Debate

Department: Department for Business and Trade


Baroness Kidron Excerpts
Monday 24th February 2025


Lords Chamber
Baroness Morgan of Cotes (Non-Afl)

My Lords, I thank the Minister for her engagement on this issue, not just with me but with Members across the House. It has been very much appreciated, including when she was not here because she was dealing with her own health issues.

When I talk about what we do here in the House of Lords, one of the great successes I point to is the scrutiny that we gave to the Online Safety Act. We did it in a cross-party way, eventually managing to persuade the Government, as well as Ofcom, about the changes that were needed. Those changes were then taken back to the House of Commons, and Ministers there conceded them. As a result of that working together, we ended up with a much stronger Bill that will do much to protect vulnerable and young people and those most at risk of harmful content online. So it is a matter of great regret that, the first time we are debating a statutory instrument of substantive interest under this Act, we—all of us, I suspect—have to say that we are deeply disappointed by the drafting that we have seen.

On 19 July 2023, I moved a very small amendment and was grateful to the House for its support. I said at the time that one change of one word—from “and” to “or”—made for a small but powerful amendment. The noble Lord, Lord Clement-Jones, set out brilliantly and comprehensively why that change was so important, so in the time available, I will not repeat what he said. The House clearly voted for change and the Minister’s own party supported that change, for which I was deeply grateful.

The other interesting thing is that Ofcom said to me that it did not object to that change. However, in its note today—I am sure that it sent the note to other Members—Ofcom talked about the harms-based approach that it is following when recommending to the Government how they should legislate under the Act. But that harms-based approach rings hollow when, through the interpretation that Ofcom has given to the Government, it has ridden roughshod over the risk posed by small but high-harm platforms.

The draft statutory instrument is based on the number of users, and this House in its amendment made it very clear that, with harmful platforms, it is not just about the number of users they have but absolutely about the content, the functionalities and the risks that those sites will raise.

As the noble Baroness set out, Ofcom is relying on paragraph 1(5) of Schedule 11, looking at

“how easily, quickly and widely regulated user-generated content is disseminated by means of the service”.

But that paragraph says that the Secretary of State “must take into account” those things, not that the Secretary of State is bound solely by those criteria. Our criticism tonight of the statutory instrument is not just about the fact that Ofcom has chosen to take those words—I would say that Ofcom in not objecting to my amendment was being disingenuous if it already knew that it was going to rely on that sub-paragraph; the bigger question for the noble Baroness tonight is the fact that the Secretary of State did not have to accept the advice that Ofcom gave them.

The noble Lord, Lord Clement-Jones, talked, as no doubt others will, about the risk and the harm that we have seen from platforms. We will talk about the fact that for the Southport victims it needed only one person to be radicalised by a site that they were looking at to cause untold misery and devastation for families. This House voted recently on the harm caused by deepfake pornographic abuse. Again, it does not take many people to utterly ruin a victim’s life, and what about those platforms that promote suicide and self-harm content? It is not sufficient to say that this Act will impose greater burdens on illegal content. We all know from debates on the Act that there is content which is deliberately not illegal but which is deeply harmful both to victims and to the vulnerable.

As Jeremy Wright MP said in the debate on these regulations in Committee in the House of Commons, the Government are going to want or need these category 1 powers to apply to smaller, high-harm platforms before too long. Indeed, the Government’s own strategic statement published last year specifically says:

“The government would like to see Ofcom keep this approach”—


that is, the approach it has to small, risky services—

“under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online”.

The Government and the Secretary of State already know that there are small but high-harm platforms causing immense risk which will not be caught by these regulations. As we have also heard, the flight to these small, high-harm, risky platforms will therefore absolutely happen, as those who want to punt out harmful content seek out platforms that are not bound by the most stringent regulations.

I will stop there because I know that others wish to speak. I will support the regret amendment tonight should the noble Lord, Lord Clement-Jones, decide to put it to a vote. It has taken far too long to get to this point. I understand the Government’s desire to make progress with these regulations, but the regret amendment states that it

“calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services”.

I ask the Minister to take that opportunity, because these regulations absolutely do not reflect the will of this House in that amendment. That is a great source of disappointment given the cross-party work that we all did to make sure the Online Safety Act was as comprehensive as it could be.

Baroness Kidron (CB)

My Lords, I remind the House of my interests, particularly as chair of 5Rights and as adviser to the Institute for Ethics in AI at Oxford. I wholeheartedly agree with both the previous speakers, and in fact, they have put the case so forcefully that I hope that the Government are listening.

I wanted to use my time to speak about the gap between the Act that we saw pass through this House and the outcome. What worries me the most is how we should understand the purpose of an Act of Parliament and the hierarchy of the instructions it contains. I ask this because, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan, have already said, the Government of the day, with the express support of Members of this House, including the Front Bench of the Labour Party, agreed that categorisation would be a question of risk or size, not simply size. That was the decision of the House, it was supported in the other place, and it is in the text of the Act. So, it would be useful to understand, in the view of His Majesty’s Government, whether the text of an Act and, separately, a statement made by a Minister from the Dispatch Box, have any authority. If they do, I cannot understand how Ofcom is allowed to overturn that, or how the Secretary of State, without taking action to amend the Act, has been able to allow it to do so.

It is essential to get a clear answer from the Minister about the status of the text of the Act, because this is a pattern of behaviour where the regulator and government appear to be cherry-picking which bits of the Online Safety Act are convenient and ignoring those they consider too difficult, too disruptive, or—I really hope not—too onerous for tech companies. Ofcom has similarly determined not to observe the provisions in the OSA about functionalities contained throughout the Act; for example, at Sections 9(5), 10(4) and 11(6)—I could go on; on extended use, at Section 11(6)(f); and on the requirement to consider the needs of children in different age groups which, like functionalities, run through the Act like a golden thread.

Ofcom’s own illegal harms register risk management guidance states that

“certain ‘functionalities’ stand out as posing particular risks because of the prominent role they appear to play in the spread of illegal content and the commission and facilitation of … offences”.

Ofcom then says its regulatory framework is intended to ensure service providers put in place safeguards to manage the risks posed by functionalities. It lists end-to-end encryption, pseudonymity and anonymity, live-streaming, content recommender systems, and, quite rightly, generative AI, all as functionality that it considers to be high risk. Specifically in relation to grooming, functionalities Ofcom considers risky include network expansion prompts, direct messaging, connection lists and automated information displays.

Despite acknowledgement that functionalities create heightened risk, a clear statement that addressing risk forms part of its regulatory duties, and the clearly expressed intent of Parliament and the wording of the Act, Ofcom has failed to comprehensively address functionalities both in the published illegal harms code and the draft children’s code, and it has chosen to overrule Parliament by ignoring the requirement in Schedule 11 to consider functionalities in determining which services should be designated as category 1 services.

Meanwhile, paragraph 4(a)(vii) of Schedule 4 is crystal clear in its objective of the Act that user-to-user services

“be designed and operated in such a way that … the different needs of children at different ages are taken into account”.

Ofcom has chosen to ignore that. Volume 5 of its draft children’s code says

“our proposals focus at this stage on setting the expectation of protections for all children under the age of 18”.

Any child, any parent and anyone who has spent time with children knows that five and 15 are not the same. The assertion from Ofcom in its narrative about the children’s code is blinding in its stupidity. If common sense cannot prevail, perhaps 100 years or more of child development study that sets out the ages and stages by which children can be expected to have the emotional and intellectual capacity to understand something could inform the regulator—and similarly, the age and stage by which we cannot expect a child to understand or have the intellectual capacity to deal with something.

The whole basis of child protection is that we should support the children on their journey from dependence to autonomy because we know that they do not have the capacity to do it for themselves in all contexts, because of the vulnerabilities associated with ages and development stages. Ofcom knows that the Act says that it should reflect this but somehow feels empowered to ignore or overrule the will of Parliament and, just as with categorisation, the Government appear to condone it.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

Ofcom’s advice was that how easily, quickly and widely content is disseminated are the key factors that it needed to make the judgment. I cannot say anything more than that.

Baroness Kidron (CB)

I am sorry to interrupt, but maybe this would be a good moment to answer my question about the hierarchy of text in an Act versus the regulator’s advice. It was my understanding, when the House agreed to that amendment, that it was an instruction to the regulator rather than something “nice to have” if it decided later that it did not like it.

Baroness Jones of Whitchurch (Lab)

The SI before us today, based on Ofcom’s advice, is the best way that we can find, in terms of practicality, of enforcing what was written in the Act.

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

My Lords, I can only say what I have already said on this. We are looking at “small but risky”. Ofcom is working hard on this, and we are working hard on this. We can review whether the categorisation process is working. As I have already set out, that option is available to us further down the line. But, at the moment, as with other parts of the Online Safety Act, we felt we needed to get on with it and put these measures into place. Already, the categorisation provisions will take another year or 18 months to come into effect, so it is not as though that is the most imminent part of the implementation of the Act. I hear what noble Lords say. None of these issues are off the table, but we just wanted to get the Act rolled out in as quick and as current a form as we could.

If I could move on, in response to the questions raised by the noble Baroness, Lady Kidron, and the noble Lords, Lord Pannick and Lord Parkinson, I am not able to share the legal advice, but, as I have said, the Secretary of State must act within the legal framework. The current thresholds are legally valid and have been considered by the Joint Committee on Statutory Instruments. In addition to small but risky services, even though in principle there is a provision that allows a user number threshold not to be met, it does not, for example, allow for sub-delegations to other parties such as coroners, which was another concern of the amendment from the noble Baroness, Lady Morgan.

The decision on the categorisation thresholds has led, as I have just been saying, some to assume that certain small high-risk services are being overlooked by the legislation. However, this is not the case, as they will be subject to the stringent illegal harm and child safety duties. I know that Members are aware that the categorisation of small but risky services would also not prevent or deter users who were determined to access harmful content on dedicated forums. Moreover, the noble Lord, Lord Clement-Jones, raised the question of small but risky services evading the core duties, such as the terms of service and user empowerment. Services that exist solely to host abusive or pro-suicide content, for example, will not have terms of service banning such content, so enforcing those terms would be ineffective in reducing harm.

In addition, the user empowerment tools will enable adult users of category 1 services to avoid certain types of content, such as harmful suicide content. We anticipate that these duties will be most beneficial when services have commercial incentives to prohibit harmful content and where users wish to avoid content they may otherwise see, but not where users are actively seeking out harmful content.

I hope that begins to explain the Secretary of State’s decision. I have to say, and have said, that it was a difficult one and, while we acknowledge the possibility of deviating from Ofcom’s advice and utilising the option to set threshold combinations without a user number, this would not have had the effect of meaningfully reducing harm on small but risky services but would risk regulating hundreds of small low-risk services.

Regarding Ofcom’s small but risky supervisor task force, which the noble Lord, Lord Clement-Jones, asked about, I am confident that Ofcom can effectively use that task force to address these issues. Ofcom already had plans to ensure compliance with the first duties that go live under the Act. These include using targeted enforcement action against small risky services where there is evidence of a significant ongoing risk of harm to users, especially children, and an apparent lack of safety measures in place. In serious cases, Ofcom can seek a court order imposing business disruption measures if there is evidence of continued non-compliance. This could mean asking a third party to withdraw from the service or asking an internet service provider to limit access.

I hope that, as the child safety and illegal content duties come into force this year and the work of the task force begins, those in this House who are concerned will be able to see how these services will not evade their responsibilities under the Act.

Regarding Wikipedia, in response to the questions raised by the noble Lords, Lord Clement-Jones and Lord Moylan, the Government are not in a position to confirm which services will be designated as category 1. Indeed, this is Ofcom’s statutory obligation once the regulations have passed and are in force. It is worth noting that many of the duties on categorised services are subject to the principle of proportionality. This requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity. Where a code of practice is relevant to a duty, Ofcom must have regard to a principle of proportionality. What is proportionate for one kind of service might not be proportionate for another.

The noble Lords, Lord Clement-Jones and Lord Moylan, also queried how Ofcom could make assessments against the definitions of certain functionalities, characteristics and user number thresholds in the statutory instrument. Once the regulations have been approved by Parliament, Ofcom will issue requests for information and will start assessing services against the threshold conditions.

I also understand that there has been concern that small low-risk platforms, such as local community forums, are being overburdened by the Act and its duties. I must reiterate that these platforms, often run by a small number of users, will not be captured by the categorisation thresholds debated today. At the same time, I acknowledge that the new illegal content and child safety duties will require some additional work from these types of services.

I assure those here today that the principles of proportionality and risk are embedded into the duties on services and Ofcom in relation to the codes of practice. This means that small and low-risk services should not be overburdened by the duties in the Online Safety Act. In efforts to ease the process for small services, Ofcom is providing support to online services to help them to understand their responsibilities under the UK’s new online safety laws. These can be found on Ofcom’s website.

My noble friend Lord Stevenson raised the question of engagement with relevant committees. I agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act and welcome the expertise Members of both Houses bring. The Government agree that it is vital that regulators are accountable for their services, including through existing annual reports and reporting requirements. We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as any other parliamentary committees that may have an interest in the Act. I am more than happy to meet my noble friend Lord Stevenson to discuss how that could be progressed further.

In response to the noble Baroness, Lady Penn, I want to put on record that a letter was shared with the Delegated Legislation and Regulatory Reform Committee in response to concerns raised during the Commons debate.

I must again stress that the Secretary of State will be holding these thresholds and the wider regulatory framework under review going forward and the Government will take whatever action is necessary to tackle risky services of any size.

I would finally like to thank all those who have contributed today: the noble Lords, Lord Clement-Jones, Lord Pannick, Lord Moylan, Lord Stevenson, Lord Russell and Lord Knight, and the noble Baronesses, Lady Morgan, Lady Kidron, Lady Penn—and of course the noble Lord, Lord Parkinson, who continues to put valuable work, expertise and energy into making the UK a safer place, both online and in the material world. I specifically thank user safety groups that have engaged with the Government on this matter and, of course, the noble Lord, Lord Clement-Jones, for his dedication to his work on these issues.

I recognise that there are some who would like to see changes to this instrument and some who believe that the decisions of the Government do not align with the intentions of the Act. I hope they understand that every decision made by this Government is made with the intention of bringing about the Act in an important and timely way. For too long, children and adults in this country have had to grapple with an unsafe online environment, and the instrument that we have debated today shows real progress.

I do not shy away from the challenge we face in navigating the ever-changing online world. I recognise that the Act is imperfect. However, it is not the destination but a significant step in the right direction. There will always be more that we can do. Years of delay and lack of progress have come at an unfathomable cost for vulnerable children and adults, with lives cut short and families’ worlds turned upside down. It is time to deliver change. I hope noble Lords will consider the time pressure and the fact that we have to get on with the rollout of the Act. I urge noble Lords to approve this vital legislation today.

Baroness Kidron (CB)

I raised a number of questions and I would be grateful, if the Minister is not going to answer them in the moment, if she could write to me about the Joint Committee, and about the hierarchy of the text of the Act and statements from the Dispatch Box versus this decision and other decisions.

Baroness Jones of Whitchurch (Lab)

My Lords, if I have not covered any issues, I will of course write to noble Lords to clarify any matters that are outstanding.