Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Debate between Baroness Morgan of Cotes and Baroness Kidron
Monday 24th February 2025

Lords Chamber
Baroness Morgan of Cotes (Non-Afl)

My Lords, I thank the Minister for her engagement on this issue, not just with me but with Members across the House. It has been very much appreciated, including when she was not here because she was dealing with her own health issues.

When I talk about what we do here in the House of Lords, one of the great successes I point to is the scrutiny that we gave to the Online Safety Act. We did it in a cross-party way, eventually managing to persuade the Government, as well as Ofcom, about the changes that were needed. Those changes were then taken back to the House of Commons, and Ministers there conceded them. As a result of that working together, we ended up with a much stronger Bill that will do much to protect vulnerable and young people and those most at risk of harmful content online. So it is a matter of great regret that, the first time we are debating a statutory instrument of substantive interest under this Act, we—all of us, I suspect—have to say that we are deeply disappointed by the drafting that we have seen.

On 19 July 2023, I moved a very small amendment and was grateful to the House for its support. I said at the time that one change of one word—from “and” to “or”—made for a small but powerful amendment. The noble Lord, Lord Clement-Jones, set out brilliantly and comprehensively why that change was so important, so in the time available, I will not repeat what he said. The House clearly voted for change and the Minister’s own party supported that change, for which I was deeply grateful.

The other interesting thing is that Ofcom said to me that it did not object to that change. However, in its note today—I am sure that it sent the note to other Members—Ofcom talked about the harms-based approach that it is following when recommending to the Government how they should legislate under the Act. But that harms-based approach rings hollow when—through Ofcom’s interpretation, which it has given to the Government—it has ridden roughshod over looking at the risk of the small but high-harm platforms.

The draft statutory instrument is based on the number of users, and this House in its amendment made it very clear that, with harmful platforms, it is not just about the number of users they have but absolutely about the content, the functionalities and the risks that those sites will raise.

As the noble Baroness set out, Ofcom is relying on paragraph 1(5) of Schedule 11, looking at

“how easily, quickly and widely regulated user-generated content is disseminated by means of the service”.

But that paragraph says that the Secretary of State “must take into account” those things, not that the Secretary of State is bound solely by those criteria. Our criticism tonight of the statutory instrument is not just that Ofcom has chosen to rely on those words; indeed, Ofcom was being disingenuous in not objecting to my amendment if it already knew that it was going to rely on that sub-paragraph. The bigger question for the noble Baroness tonight is that the Secretary of State did not have to accept the advice that Ofcom gave them.

The noble Lord, Lord Clement-Jones, talked, as no doubt others will, about the risk and the harm that we have seen from platforms. We will talk about the fact that for the Southport victims it needed only one person to be radicalised by a site that they were looking at to cause untold misery and devastation for families. This House voted recently on the harm caused by deepfake pornographic abuse. Again, it does not take many people to utterly ruin a victim’s life, and what about those platforms that promote suicide and self-harm content? It is not sufficient to say that this Act will impose greater burdens on illegal content. We all know from debates on the Act that there is content which is deliberately not illegal but which is deeply harmful both to victims and to the vulnerable.

As Jeremy Wright MP said in the debate on these regulations in Committee in the House of Commons, the Government are going to want or need these category 1 powers to apply to smaller, high-harm platforms before too long. Indeed, the Government’s own strategic statement published last year specifically says:

“The government would like to see Ofcom keep this approach”—


that is, the approach it has to small, risky services—

“under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online”.

The Government and the Secretary of State already know that there are small but high-harm platforms causing immense risk which will not be caught by these regulations. As we have also heard, a flight to these small, high-harm, risky platforms will therefore happen, as those who want to punt out harmful content seek out platforms that are not bound by the most stringent regulations.

I will stop there because I know that others wish to speak. I will support the regret amendment tonight should the noble Lord, Lord Clement-Jones, decide to put it to a vote. It has taken far too long to get to this point. I understand the Government’s desire to make progress with these regulations, but the regret amendment states that it

“calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services”.

I ask the Minister to take that opportunity, because these regulations absolutely do not reflect the will of this House in that amendment. That is a great source of disappointment given the cross-party work that we all did to make sure the Online Safety Act was as comprehensive as it could be.

Baroness Kidron (CB)

My Lords, I remind the House of my interests, particularly as chair of 5Rights and as adviser to the Institute for Ethics in AI at Oxford. I wholeheartedly agree with both the previous speakers, and in fact, they have put the case so forcefully that I hope that the Government are listening.

I wanted to use my time to speak about the gap between the Act that we saw pass through this House and the outcome. What worries me the most is how we should understand the purpose of an Act of Parliament and the hierarchy of the instructions it contains. I ask this because, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Morgan, have already said, the Government of the day, with the express support of Members of this House, including the Front Bench of the Labour Party, agreed that categorisation would be a question of risk or size, not simply size. That was the decision of the House, it was supported in the other place, and it is in the text of the Act. So, it would be useful to understand, in the view of His Majesty’s Government, whether the text of an Act and, separately, a statement made by a Minister from the Dispatch Box, have any authority. If they do, I cannot understand how Ofcom is allowed to overturn that, or how the Secretary of State, without taking action to amend the Act, has been able to allow it to do so.

It is essential to get a clear answer from the Minister about the status of the text of the Act, because this is a pattern of behaviour where the regulator and government appear to be cherry-picking which bits of the Online Safety Act are convenient and ignoring those they consider too difficult, too disruptive, or—I really hope not—too onerous for tech companies. Ofcom has similarly determined not to observe the provisions in the OSA about functionalities contained throughout the Act; for example, at Sections 9(5), 10(4) and 11(6)—I could go on; on extended use, at Section 11(6)(f); and on the requirement to consider the needs of children in different age groups which, like functionalities, run through the Act like a golden thread.

Ofcom’s own illegal harms register risk management guidance states that

“certain ‘functionalities’ stand out as posing particular risks because of the prominent role they appear to play in the spread of illegal content and the commission and facilitation of … offences”.

Ofcom then says its regulatory framework is intended to ensure service providers put in place safeguards to manage the risks posed by functionalities. It lists end-to-end encryption, pseudonymity and anonymity, live-streaming, content recommender systems, and, quite rightly, generative AI, all as functionality that it considers to be high risk. Specifically in relation to grooming, functionalities Ofcom considers risky include network expansion prompts, direct messaging, connection lists and automated information displays.

Despite its acknowledgement that functionalities create heightened risk, a clear statement that addressing risk forms part of its regulatory duties, and the clearly expressed intent of Parliament and the wording of the Act, Ofcom has failed to address functionalities comprehensively in both the published illegal harms code and the draft children’s code, and it has chosen to overrule Parliament by ignoring the requirement in Schedule 11 to consider functionalities in determining which services should be designated as category 1 services.

Meanwhile, paragraph 4(a)(vii) of Schedule 4 is crystal clear in its objective of the Act that user-to-user services

“be designed and operated in such a way that … the different needs of children at different ages are taken into account”.

Ofcom has chosen to ignore that. Volume 5 of its draft children’s code says

“our proposals focus at this stage on setting the expectation of protections for all children under the age of 18”.

Any child, any parent and anyone who has spent time with children knows that five and 15 are not the same. The assertion from Ofcom in its narrative about the children’s code is blinding in its stupidity. If common sense cannot prevail, perhaps 100 years or more of child development study that sets out the ages and stages by which children can be expected to have the emotional and intellectual capacity to understand something could inform the regulator—and similarly, the age and stage by which we cannot expect a child to understand or have the intellectual capacity to deal with something.

The whole basis of child protection is that we should support the children on their journey from dependence to autonomy because we know that they do not have the capacity to do it for themselves in all contexts, because of the vulnerabilities associated with ages and development stages. Ofcom knows that the Act says that it should reflect this but somehow feels empowered to ignore or overrule the will of Parliament and, just as with categorisation, the Government appear to condone it.