Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 Debate

Department: Department for Science, Innovation & Technology

Tuesday 4th February 2025


General Committees
The Chair

Order. I apologise for the slightly late start.

The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

I beg to move,

That the Committee has considered the draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025.

Thank you for coming to save the day, Sir Christopher; it is an honour to serve under your chairmanship. These regulations were laid before Parliament on 16 December 2024. As the Online Safety Act 2023 sets out, the Secretary of State must set thresholds for three categories of service: category 1, category 2A and category 2B. The services that fall into each of those categories will be required to comply with additional duties, with category 1 services having the most duties placed on them. The duties are in addition to the core duties that apply to all user-to-user and search services in scope.

The 2023 Act requires the Secretary of State to take specific factors into account when deciding the thresholds for each category. The threshold conditions for user-to-user services must be based on user numbers and functionalities, as well as any other characteristics or factors relating to the user-to-user part of the service that the Secretary of State deems relevant.

For category 1, the key consideration is the likely impact of the number of users of the user-to-user part of the service, and of its functionalities, on how quickly, easily and widely regulated user-generated content is disseminated by means of the service. For category 2A, the key consideration is the likely impact of the number of users of the search engine on the level of risk of harm to individuals from search content that is illegal or harmful to children. For category 2B, the key consideration is the likely impact of the number of users of the user-to-user part of the service, and of its functionalities, on the level of risk of harm to individuals from illegal content or content that is harmful to children disseminated by means of the service.

Those conditions form the basis of Ofcom’s independent research and advice, as published in March 2024, which the Secretary of State was required to consider when setting threshold conditions. In laying these regulations before Parliament, the Secretary of State has considered the research carried out and the advice from Ofcom and agreed to its recommendations.

I understand that this decision will not please everyone. In particular, I recognise that the thresholds are unlikely to capture so-called “small but risky” services, even though Baroness Morgan’s successful amendment made it possible to create a threshold condition by reference only to functionalities and any other factors or characteristics. However, it is important to note that all regulated user-to-user and search services, no matter their size, will be subject to existing illegal content duties and, where relevant, child safety duties. The categories do not change that fact.

If the codes on illegal content duties currently laid before Parliament pass without objection, the duties will be in effect by this spring. They will force services to put in place systems and processes to tackle illegal content. If a service is likely to be accessed by children, the child safety duties will require services to conduct a child safety risk assessment and provide safety measures for child users. We expect that those will come into effect this summer, on the basis that the codes for the duties will have passed by then.

Together, the illegal content and child safety duties will mark the biggest material change in online safety for UK citizens since the internet era began. We expect the Online Safety Act to cover more than 100,000 services of various sizes, showing that the legislation goes far and wide to ensure important protections for users, particularly children, online.

The instrument before us will enable additional duties for categorised services. All categorised services must comply with transparency reporting duties. They must also have terms of service covering the ability of parents to access information about a child’s use of the service in the event of the child’s death. Category 1 services will have the most additional requirements. They will have to give adults more choice about the content they see and the people they interact with, and they must protect journalistic and news publisher content and content of democratic importance. The duties will also enable us to hold these companies to account over their terms of service, so that they keep the promises they make to their users.

Once in force, the regulations will enable Ofcom to establish a public register of categorised services, which it expects to publish this summer. Ofcom will then consult on the draft codes of practice and guidance where relevant for additional duties. Ofcom will also do additional work to tackle small but risky services.

Ofcom’s work to tackle egregious content and enhance accountability does not stop with this instrument, which takes me back to the small but risky services that I mentioned. The horrifying stories I have heard about these sites during a number of debates recently are truly heartbreaking; we must do everything in our power to prevent vulnerable people from falling victim to such circumstances. I was pleased to see Ofcom set out in September 2024 its targeted approach to tackling small but risky services, which includes a dedicated supervision taskforce and a commitment to move to rapid enforcement action where necessary. That followed a letter from the Secretary of State to Ofcom inquiring about those services.

I am confident that the regulatory framework, combined with the bespoke taskforce, will work to keep all UK citizens safe online, but I must stress that the Secretary of State will hold the thresholds under review going forward. If there is evidence that the categories have become outdated or that they inadequately protect users, he will not shy away from updating them or reviewing the legislation, as he has made clear recently.

Finally, the online world that we are looking to govern is complex and ever-changing. The Act will not solve every problem, but it will bring real benefits to children and adults who have had to contend with an unsafe online world for far too long. We should see the instruments we are debating as a step in that process and a first iteration, not as something fixed or set in stone, because there is much more to do. Our foremost priority is the timely implementation of the Act, so that the additional duties can be enforced as soon as possible. Years of delay and indecision have already come at a heartbreaking cost for vulnerable children and adults. Now it is time to deliver, but that relies on Parliament approving the categorisation thresholds without delay.

--- Later in debate ---
Feryal Clark

I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.

The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree that companies should not wait for the duties in the Act to be implemented, but should ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity; in some cases, duties are based on an assessment of the risk of harm presented by the service.

For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.

The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are considering its findings and will publish the review in due course.

In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults—as those Members read out very clearly. That was removed during the passage of the Bill by the previous Government.

As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is the easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting the threshold conditions. Those powers do not allow for thresholds to be determined by another body, as per the amendment.

Although the hon. Member for Aberdeen North read out the Act very powerfully, the Act clearly does not do what she is asking it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.

Kirsty Blackman

Will the Minister give way?

Feryal Clark

Sorry, I have lots of points to cover. If I have not covered the hon. Member’s concerns in my response, she is more than welcome to intervene later.

These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.

The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.

The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.

Martin Wrigley

The Minister raised the issue of age verification, which is good. However, she did not say how “harmful to adults”, “harmful to vulnerable minorities” and “harmful to women” are categorised. Children are protected in this case, but those other groups are not.

Also, in response to the answer that the Minister just gave, the difficulty is not the Ofcom powers; it is the obligation on the provider. If we have not put a provider into category 1, it does not have the same level of obligation as category 1 companies do. No matter what powers Ofcom has and no matter what fines it imposes, it cannot get such companies to give those commitments to a category 1 level if they are not in that category.

Removing the section is not giving Ofcom the tools it needs. The Minister was absolutely right earlier when she said that there is much more to do. Why drop this ability to put other sites in category 1?

Feryal Clark

I think the hon. Member missed it when I said that, as things stand, the Secretary of State does not have the power to include them. It is not about removing them; it is about not having the powers, as things stand, to include them.

I will conclude. In extreme cases, Ofcom, with the agreement of the courts, can use business disruption measures: court orders that require third parties to withdraw services from, or to restrict or block access to, non-compliant services in the UK.

The hon. Member for Newton Abbot also asked whether the Act will be reviewed to address the gaps in it. As I said at the start, our immediate focus is getting the Act implemented quickly and effectively. It was designed to tackle illegal content and protect children, and we want those protections in place as soon as possible. It is right that the Government continually assess the ability of the framework to keep us safe, especially given that technology develops so quickly. We will look, of course, at how effective these protections are and build on the Online Safety Act, based on evidence. However, our message to social media companies remains clear: there is no need to wait. As the Opposition spokesperson said, those companies can and should take immediate action to protect their users.

On the use of business disruption measures, the Act provides Ofcom with powers to apply to court for such measures, as I have said, including where there is continued failure and non-compliance. We expect Ofcom to use all available enforcement mechanisms.

The hon. Member for Huntingdon asked how Parliament can scrutinise the delivery of the legislation. Ongoing parliamentary scrutiny is absolutely crucial; indeed, the Online Safety Act requires Ofcom codes to be laid before Parliament for scrutiny. The Science, Innovation and Technology Committee and the Communications and Digital Committee of the House of Lords will play a vital role in scrutinising the regime. Ofcom’s codes of practice for illegal content duties were laid before Parliament in December. Subject to their passing without objection, we expect them to be in force by spring 2025, and the child safety codes are expected to be laid before Parliament in April, in order to be in effect by summer 2025. Under section 178 of the Act, the Secretary of State is required to review the effectiveness of its regulatory framework between two and five years after key provisions of the Act come into force. That will be published as a report and laid before Parliament.

Letters were sent in advance of laying these regulations to the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee. Hon. Members have asked about user numbers. For category 1, Ofcom recommended user number thresholds of 34 million or 7 million users, depending on the functionalities a service offers; services must exceed those thresholds. The Government are not in a position to confirm which services will be categorised: that will be Ofcom’s statutory role once the regulations have passed.

Robin Swann

Will the Minister give way?

Feryal Clark

I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.

Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.

On child safety, there were questions about how online safety protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.

In addition, services that are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform, and the reporting mechanism will need to be easy for child users to navigate. On 8 May 2024, Ofcom published its draft children’s safety codes of practice, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.

Finally, on why category 1 is not based on risk, such as the risk of hate speech: when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of the service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.

Kirsty Blackman

The Minister is making the case that the Secretary of State’s hands are tied by the Act—that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.

Feryal Clark

The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. On that basis, Ofcom concluded that content is disseminated more widely as the number of users increases.

The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user number thresholds, reflects the fact that any threshold condition created by the Government should take account of the factors set out in the Act, including easy, quick and wide dissemination for category 1, and should be grounded in the evidence base. That is what the Act says. As a result, the Government decided not to proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.

I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.

Sir Jeremy Wright

I am extremely grateful to the Minister for giving way, and I have sympathy with her position, especially in relation to legal advice, having both received it and given it. I suggest that the Minister is talking about two different things, and they need to be separated. The first is the question of whether legal but harmful content was removed from the Bill, which it undoubtedly was. Measures in relation to content that is neither unlawful nor harmful to children were largely removed from the Bill—the Minister is right to say that.

What we are discussing, however, are the tools available to Ofcom to deal with those platforms that it is still concerned about in relation to the remaining content within the ambit of the Bill. The worry of those of us who have spoken in the debate is that the Government are about to remove one of the tools that Ofcom would have had to deal with smaller, high-harm platforms where the harm in question remains within the ambit of the Bill—not the harm that was taken out during its passage. Would the Minister accept that?

Feryal Clark

I will again set out what the Secretary of State’s powers are. The Government have considered the suggestion of Baroness Morgan and others to categorise small but risky services on the basis of a coroner or Ofcom linking a service to a death. The Government were grateful for that suggestion. However, there were issues with that approach, including what the Act allows the Secretary of State to consider when setting the categories. The Secretary of State is not allowed to consider anything other than the factors set out in the Act, which for category 1 include easy, quick and wide dissemination, and the decision has to be evidence based.

I hope that the hon. Member for Aberdeen North will accept that I will write to her in great detail, and include a letter from Government lawyers setting out what I am saying in relation to the powers of the Secretary of State in setting the categories. I hope that she will be satisfied with that. I want to make it clear that we are not taking anything out; the Secretary of State is proceeding with the powers that he has been given.

Martin Wrigley

Will the Minister give way?

Feryal Clark

I am going to proceed. I think I have covered the main points raised by hon. Members. I hope that the Committee agrees with me on the importance of enacting these thresholds and implementing the Online Safety Act as swiftly as possible. I made it clear that Ofcom has set up a taskforce that will review the small but risky sites, in response to the Secretary of State’s letter to it in September.

Sir Ashley Fox (Bridgwater) (Con)

It is an honour to serve under your chairmanship, Sir Christopher. My right hon. and learned Friend the Member for Kenilworth and Southam was Attorney General for four years. It is just possible that his interpretation of the Act is correct, and that of the Minister’s officials is incorrect. I do not have detailed knowledge of this legislation, but I wonder whether the Minister and her Whip want to take some further time and pause before putting these regulations to a vote—that would be perfectly acceptable to us. We will not oppose the regulations, but if the Minister wants more time, she is welcome to take it.

Feryal Clark

Although I thank the hon. Member for his contribution, I am sure that he will appreciate that this issue has been looked into and discussed in debates and with officials. With that, I commend these regulations to the Committee.

--- Later in debate ---
Feryal Clark

The comments made by the hon. Member for Aberdeen North are absolutely outrageous, but I would not expect anything less from the SNP. I have made it very clear that I will share legal advice with Members. I also made it clear that the small but risky sites that Members have been talking about were raised by the Secretary of State in a letter to Ofcom in September, and Ofcom has set up a taskforce to look at those services.

The key thing for the Government is to get on with implementing the Online Safety Act. I know that the hon. Lady would like us to spend lots of time delaying, but we are interested in getting on with implementing the Act so that we can keep children safe online. With that, I commend the regulations to the House.

The Chair

For the benefit of people watching, only Committee members can cast votes in a Division.

Question put.