Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 Debate

Department: Department for Science, Innovation & Technology


Jeremy Wright Excerpts
Tuesday 4th February 2025


General Committees
Sir Jeremy Wright (Kenilworth and Southam) (Con)

It is a great and unexpected pleasure to serve under your chairmanship, Sir Christopher. I want to take this opportunity to say something about why I think these regulations are a mistake. I agree with a great deal of what the hon. Member for Aberdeen North (Kirsty Blackman) has just said—I will seek not to repeat it—but it is probably worth noting at the outset that, as the Minister has rightly explained, these regulations are not the only means by which we will hold online services to account under this legislation.

A category 1 designation allows Ofcom—the regulator—to impose additional constraints on a platform. I think that is an entirely fair point to make, but as the hon. Lady observed, something like 100,000 online services are likely to be in scope of this Act overall. It is worth noting that, in Ofcom’s assessment, only some 12 to 16 services would qualify for category 1 status if, as is currently the case, size were the only criterion and we set the limit—as these regulations seek to do—at 7 million monthly users.

As the hon. Lady explained, over a considerable period of time, with a considerable amount of energy expended, Parliament decided that it was appropriate to include in the category 1 designation not just the largest services, but also much smaller services on which a great deal of harm may be concentrated. A service might be smaller organically, or, of course, because harmful content seeks refuge from the regulation applied to the larger services by migrating to smaller ones.

There is good reason, therefore, to think that having smaller services potentially included in category 1 designation is a tool that Ofcom, and indeed the Government, will want to have available.

Platforms such as those that specialise in suicide or self-harm might well be the kind of platforms that we find ourselves increasingly concerned about and that the Government will increasingly be asked to do something about. I have to say to the Minister that it is not sensible to remove from the regulator’s hands the tools that it might want to use to do what the Government will undoubtedly ask it to do—the Government themselves will come under pressure to act.

Again, as has been explained, what or who we include in that category 1 designation really matters, because of the additional powers and constraints that Ofcom will have available to it in relation to category 1 services. Those powers include the only powers available under this Act to protect adults from anything that is not illegal content—including vulnerable adults, by the way. There will come a time when the Government, I suspect, will wish they had more to deal with problems of that nature. As the hon. Member for Aberdeen North explained, the Act gives those powers, so it is bizarre in the extreme that the Government should choose voluntarily not to use them. It is bizarre, also, because the Labour party in opposition was clear in its support for the change.

The hon. Member for Newton Abbot quoted one example of something that the shadow spokesman at the time, the hon. Member for Pontypridd (Alex Davies-Jones), who now has Government responsibilities elsewhere, said during the passage of the Bill. I will quote another example to the Committee. She said:

“Categorisation of services based on size rather than risk of harm will mean that the Bill will fail to address some of the most extreme harms on the internet.”––[Official Report, Online Safety Public Bill Committee, 12 July 2022; c. 168.]

I think she was absolutely right then, and still is now. The draft regulations, I am afraid, do exactly what she said the Act should not do: they limit the criterion for the designation of category 1, and these additional powers, to size only.

We should think about the Government’s rationale for what they are doing. In December, the Secretary of State made a written statement to set out the reasoning for the measures that the Government have put before the Committee:

“In making these Regulations, I have considered factors as required by the Act. Amendments made during the passage of the Act, changed the consideration for Category 1 from the ‘level of risk of harm to adults from priority content that is harmful to adults disseminated by means of the service’ to ‘how easily, quickly and widely regulated user-generated content is disseminated by means of the service.’ This was a significant change”.—[Official Report, 16 December 2024; Vol. 759, c. 12WS.]

In other words, I think the Secretary of State was arguing that he has no option but to limit category 1 designation to a scale-only criterion, because that is how the Act has changed. That is fundamentally mistaken, if I may say so to the Minister. I do not expect her to have all this before her—I know her officials will take careful note—but the Act states at paragraph 1(5) of schedule 11:

“In making regulations under sub-paragraph (1)”—

the draft regulations we are discussing—

“the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on”—

and this is the part the Secretary of State drew out in his statement—

“how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

Without doubt, therefore, the Secretary of State has to take the number of users into account, but it is not the only criterion. There is a fundamental misunderstanding—at least, I hope that is what it is—in the ministerial statement, which suggests that that is the only criterion to be considered. It is not, and I think it is a mistake to ignore the others, which, again, have already been drawn out in the debate.

To be clear, these draft regulations mean that no smaller platform—under the level of 7 million monthly users—can ever be considered as a category 1 platform, unless or until the Government and Ofcom change their approach to the categorisation process. I repeat the point, and I make no apologies for doing so, that that is specifically contrary to what Parliament had intended in the passage of the Act.

The hon. Member for Aberdeen North and I are not the only ones making this observation. There are multiple organisations with whom we, and the then Labour Opposition, worked closely to get this Act passed for the protection of those about whom the Labour party is charged with worrying. Those include the Samaritans, Mind, the Centre for Countering Digital Hate, the Antisemitism Policy Trust and the Molly Rose Foundation, all of which care deeply about the effectiveness of this legislation, as I am sure we all do.

It is true, and the Minister may make this point, that Ofcom’s advice suggested the course of action the Government are now taking. However, “advice” is the key word. The Government were not obliged to take it, and in this instance I think they would have been wiser to resist it. Ofcom will not have all the tools it could have to deal with smaller services where greater harm may be concentrated, despite what the Act allows. I have to say that tying one hand behind Ofcom’s back is not sensible, even when Ofcom is itself asking us to do so. That is especially true when the Government place such heavy reliance on the Online Safety Act—as they are entitled to—to deal with the multiple online harms that arise.

I have lost count, as I suspect others in this Committee have, of the number of times that Ministers have referred to the Online Safety Act when challenged about harmful materials or behaviours online and said, “This is the answer. This Act gives us powers to act against services that do not do what they should.” They are right: it is not a perfect piece of legislation, and none of us involved in its generation would claim that it was, but it does give Government and regulators the powers to act. However, that does us no good at all if, in subsequent secondary legislation, the Government choose not to use those tools or put them beyond Ofcom’s reach. That is what the regulations do.

I have to say to the Minister that government is hard enough. She should not throw away the tools she needs to do the job that she has promised everyone that she will do. This is a mistake, and I hope that even at this late stage the Minister will find a way to avoid making it.

--- Later in debate ---
Feryal Clark

The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. Ofcom accordingly concluded that, for category 1, content is disseminated more widely as the number of users increases.

The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided to not proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.

I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.

Sir Jeremy Wright

I am extremely grateful to the Minister for giving way, and I have sympathy with her position, especially in relation to legal advice, having both received it and given it. I suggest that the Minister is talking about two different things, and they need to be separated. The first is the question of whether legal but harmful content was removed from the Bill, which it undoubtedly was. Measures in relation to content that is neither unlawful nor harmful to children were largely removed from the Bill—the Minister is right to say that.

What we are discussing, however, are the tools available to Ofcom to deal with those platforms that it is still concerned about in relation to the remaining content within the ambit of the Bill. The worry of those of us who have spoken in the debate is that the Government are about to remove one of the tools that Ofcom would have had to deal with smaller, high-harm platforms when the harm in question remains within the ambit of the Bill—not that which was taken out during its passage. Would the Minister accept that?

Feryal Clark

I will again set out what the Secretary of State’s powers are. The Government have considered the suggestion of Baroness Morgan and others to categorise “small but risky” services on the basis of a coroner or Ofcom linking a service to a death. The Government were grateful for that suggestion. However, there were issues with that approach, including what the Act allows the Secretary of State to consider when setting the categories. The Secretary of State is not allowed to consider anything other than the factors set out in the Act, which says that the conditions have to include easy, quick and wide dissemination for category 1, and have to be evidence based.

I hope that the hon. Member for Aberdeen North will accept that I will write to her in great detail, and include a letter from Government lawyers setting out what I am saying in relation to the powers of the Secretary of State in setting the categories. I hope that she will be satisfied with that. I want to make it clear that we are not taking anything out; the Secretary of State is proceeding with the powers that he has been given.

--- Later in debate ---
Kirsty Blackman

Thank you, Sir Christopher—I appreciate that prod. I did look at Standing Orders this morning, but could not find that bit, so that is incredibly helpful.

On what the Minister said about schedule 11 and the notes that she has been passed from her team on that point, I appreciate her commitment to share the Government’s legal advice. That will be incredibly helpful; it would have been helpful to have it in advance of this Committee.

In schedule 11, it says:

“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

Perhaps I cannot read English, or perhaps the Minister, her legal advisers and the team at DSIT read it in a different way from me, but the Secretary of State having to take something into account and the Secretary of State being bound by something are two different things. It does not say that the Secretary of State must regulate only on the specific number of users.

In fact, schedule 11 says earlier that the Secretary of State

“must make regulations specifying conditions…for the user-to-user part of regulated user-to-user services relating to each of the following”,

which are the

“number of users…functionalities of that part of the service, and…any other characteristics of that part of the service or factors”.

The Secretary of State must therefore make regulations in relation to any other characteristics of that part of the service or factors

“relating to that part of the service that the Secretary of State considers relevant.”

He must do that; the number of users, by contrast, is something he need only take into account. The Government, however, have treated the weaker duty to take into account as though it outweighed the “must”. They have done so despite Parliament being pretty clear in the language it has used.

I am not terribly happy with the Online Safety Act. It is a lot better than the situation we have currently, but it is far from perfect. As the Minister said, I argued in favour of keeping the stuff about legal but harmful content for adults. I argued against the then Government’s position on that, but the Act is the Act that we have.

The Minister’s point does not make sense. The Secretary of State has to take into account the number of users and how quickly things are disseminated, but he must make regulations about functionalities or factors that he considers relevant. Therefore, it seems that he does not consider suicide forums and livestreaming to be relevant; if he did, he would surely be bound by the “must” and would have to make regulations about them. It is frustrating that the Act does not do what it is supposed to do and does not protect young people from livestreaming. The Minister said that it protects people from seeing that illegal content, but it does not prevent them from creating it.

The Government could make regulations so that every platform that has a livestreaming functionality, or even every platform that has child users on it—there is a lot in the Act about the proportion of children who use a service—is automatically included in category 1 because they consider them to be high risk.

Sir Jeremy Wright

It would not be right for either of us to ask the Minister to disclose legal advice—that clearly would not be appropriate—but I am grateful for the Minister’s offer to share a slightly more expansive description of why the Government have come to the conclusion that they have.

On the hon. Lady’s point about what the Act actually says, we have both quoted paragraph 1(5) of schedule 11, which deals with whether the language that has found its way into the ministerial statement is the be-all and end-all of the Minister’s conclusions. We both think it is not. If it is the case, as I think the Minister is arguing, that the ability to disseminate “easily, quickly and widely” is essentially a synonym for the scale of the service and the number of its users, what does the hon. Lady think of the amendment that Baroness Morgan made in the other place to paragraph 1(4), which says that when the regulations we are considering specify

“the way or ways in which the relevant conditions are met”,

for category 1 threshold conditions

“at least one specified condition about number of users or functionality must be met”?

The crucial word that was added is “or”. If the number of users were required to establish what the hon. Lady has described, the word “or” would be inappropriate.

Kirsty Blackman

I absolutely agree, and that is a helpful clarification.

If the Government have decided that it is too difficult to regulate high-risk platforms as category 1, and that they do not matter enough because they do not have enough of an impact, they should stand up and tell us that. Rather than saying that their hands have been tied by the Act—they manifestly have not—they need to take ownership of their actions. If they have decided that such platforms are not important enough or that they cannot be bothered having a fight with Ofcom about that, they should be honest and say, “This is the position we have decided to take.” Instead, they are standing up and saying, “Our hands have been tied,” but that is just not correct: their hands have not been tied by the Act.

I appreciate that the Minister will get in touch with me about the legal advice, but it will be too late. This statutory instrument will have been through the process by that time, and people will have been put at risk as a result of the Government’s failure. They have the power to take action in relation to functionalities and factors, and in relation to suicide forums, livestreaming and the creation of child sexual abuse material, and they are choosing not to.

If the Government have decided that it is too difficult to do that, that those platforms are not risky enough and that not enough people are being harmed by them, they need to hold their hands up and say, “We’ve decided that this is the position we are going to take.” They must not hide behind the legislation, which does not say what they are telling us it says. They should just be honest about the fact that they have decided that they cannot be bothered to take action. They cannot be bothered to have a fight with Ofcom because it is not important enough. Hiding behind the legislation is incredibly cowardly—it does not say that.