Today, the Government have reached two significant milestones in the implementation of the Online Safety Act (“the Act”), marking an important step forward in creating a safer online environment for all UK citizens. I am laying in Parliament Ofcom’s first draft codes of practice for the illegal content duties, together with draft regulations setting out the threshold conditions for category 1, 2A and 2B services under the Act.
Ofcom’s draft illegal content duties codes of practice
The illegal content duties apply to all regulated user-to-user and search services under the Act, no matter their size or reach. These include new duties to have systems and processes in place to tackle illegal content and activity. Ofcom, as the independent regulator for this regime, is required to set out measures in codes of practice that providers can take to fulfil these statutory duties. Ofcom has now submitted to me the drafts of its first codes of practice for the illegal content duties, so that I may lay them in Parliament for scrutiny. If neither House objects to the draft codes, Ofcom must issue the codes and the illegal content duties will come into force 21 calendar days later. Once the codes have come into force, the statutory safety duties will begin to apply to service providers, and Ofcom will be able to enforce against non-compliance.
Ofcom has also published its guidance on how providers should carry out risk assessments for illegal content and activity. Providers now have three months to complete their illegal content risk assessment.
The completion of these risk assessments should coincide with the codes of practice coming into force, provided the codes complete the statutory laying period. Ofcom’s codes will set out steps service providers can take to address identified risks. The draft codes will drive significant improvements in online safety in several areas. They will ensure service providers put in place effective systems and processes to take down illegal content, including content that amounts to terrorism offences, child sexual abuse material (CSAM), public order offences, assisting suicide, intimate image abuse and other offences. They will make it materially harder for strangers to contact children online, to protect children from grooming. They will significantly expand the number of services that use automated tools to detect CSAM. They will make it significantly easier for the police and the Financial Conduct Authority (FCA) to report fraud and scams to online service providers. And they will make it easier for users to report potentially illegal content.
The draft codes are a vital step in implementing the new regime. Ofcom fully intends to build on these foundations and has announced plans to launch a consultation in spring 2025 on additional measures for the codes. This includes consulting on how automated tools can be used to proactively detect illegal content, including the content most harmful to children, going beyond the automated detection measures that Ofcom has already included. Bringing in the codes will be a key milestone in creating a safer online environment for UK citizens as the duties begin to apply and become enforceable.
Categorisation thresholds
Services which are ‘categorised’ under the Act will have additional duties placed on them. This is on top of the duties which all regulated user-to-user and search services must comply with to tackle illegal content and, where relevant, to protect children from content that is legal but nonetheless harmful to them. The additional duties will vary depending on whether a service is designated category 1 (large user-to-user services), category 2A (large search services) or category 2B (smaller categorised user-to-user services).
In making these regulations, I have considered the factors required by the Act. Amendments made during the passage of the Act changed the consideration for category 1 from the “level of risk of harm to adults from priority content that is harmful to adults disseminated by means of the service” to “how easily, quickly and widely regulated user-generated content is disseminated by means of the service”. This was a significant change and, while I understand that this approach has its critics who argue that the risk of harm is the more significant factor, this is the position under the Act.
Ofcom’s advice and the decision of the Secretary of State (Peter Kyle) on threshold conditions
The Act required Ofcom to carry out research within six months of Royal Assent, and to then provide the Secretary of State with advice on the threshold conditions for each of the three categories. This research included a call for evidence so that stakeholder feedback could be considered in Ofcom’s advice.
After considering Ofcom’s advice and subsequent clarificatory information in public letters, I have decided to set threshold conditions for categorisation in accordance with Ofcom’s recommendations. I am satisfied that Ofcom’s advice, which was published in March, is the culmination of an objective, evidence-based process. I have taken this decision in line with the factors set out in schedule 11 of the Act. I have been very clear to date, and want to reiterate, that my priority is the swift implementation of the Act’s duties to create a safer online environment for everyone. I am open to further research in the future and to updating the thresholds in force if necessary.
I appreciate that there may be some concerns that, at this time, threshold conditions have not been set to capture so-called “small but risky” services by reference to certain functionalities and characteristics or factors. My decision to proceed with the thresholds recommended by Ofcom, rather than taking an approach that discounts user number thresholds, reflects the fact that any threshold condition created by the Government should take into account the factors set out in the Act, be evidence-based and avoid the risk of unintended consequences.
I also welcome Ofcom’s statement that it is keenly aware that the smallest online services can represent a significant risk to UK citizens, that it has established a dedicated “small but risky” supervision taskforce, and that it will use the tools available under the Act to identify, manage and enforce against such services where there is a failure to comply with the duties to which all regulated services will be subject. This includes enforcement powers: to impose penalties on service providers of up to 10% of qualifying worldwide revenue or £18 million, whichever is greater; to require services to take remedial action; and in certain cases, to apply to court for business disruption measures to be taken against service providers.
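By way of illustration only, the maximum penalty is the greater of the two figures. A minimal sketch in Python, with a function name and variable of my own invention rather than anything drawn from the Act:

```python
def max_penalty_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Illustrative cap: the greater of 10% of qualifying worldwide revenue and £18 million."""
    return max(0.10 * qualifying_worldwide_revenue_gbp, 18_000_000)

# A provider with £1bn of qualifying worldwide revenue faces a cap of £100m,
# because 10% of revenue exceeds £18m.
assert max_penalty_gbp(1_000_000_000) == 100_000_000
# A provider with £50m of qualifying worldwide revenue faces the £18m figure instead.
assert max_penalty_gbp(50_000_000) == 18_000_000
```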
As Secretary of State, my priority is timely implementation of the Act to ensure that the additional duties are enforceable as soon as possible. Ofcom’s recently updated implementation roadmap sets out its aim to publish the register of categorised services in summer 2025 and to launch transparency reporting within a few weeks of publication of the register. This timeline is contingent on the regulations for categorisation thresholds being approved by Parliament without delay.
Proportionality
Many of the additional duties for categorised services have proportionality as a relevant consideration. For example, in determining what is proportionate for the user empowerment content duty, the findings of the most recent user empowerment assessment, including the incidence of relevant content on the service, are relevant, in addition to the size and capacity of the provider. When producing its guidance and codes of practice, Ofcom will have regard to the principle of proportionality. In line with Ofcom’s recommendations, we have made it clear in the regulations that services are not captured under category 1 if they use a content recommender system which only recommends a user’s own content back to that user.
Threshold conditions
Following Ofcom’s advice, and having taken into account the matters required by the Act, I have today laid draft regulations intended to give effect to the following threshold conditions for each category of service:
The Category 1 threshold conditions are met by a regulated user-to-user service where, in respect of the user-to-user part of that service, it:
has an average number of monthly active United Kingdom users that exceeds 34 million and uses a content recommender system, or
has an average number of monthly active United Kingdom users that exceeds 7 million, uses a content recommender system and provides a functionality for users to forward or share regulated user-generated content on the service with other users of that service.
The Category 2A threshold conditions are met by a search engine of a regulated search service or a combined service where it:
has an average number of monthly active United Kingdom users that exceeds 7 million, and
is not a vertical search engine—a search engine which only enables a user to search selected websites or databases in relation to a specific topic, theme or genre of search content.
The Category 2B threshold conditions are met by a regulated user-to-user service where, in respect of the user-to-user part of that service, it:
has an average number of monthly active United Kingdom users that exceeds 3 million and provides a functionality for users to send direct messages to other users of the same service.
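Purely for illustration, and not as part of the draft regulations, the threshold conditions above amount to a simple decision rule. The sketch below expresses that logic in Python; all field and function names are hypothetical, and it reports only which sets of threshold conditions a service would meet:

```python
# Illustrative sketch of the draft threshold conditions. Names are hypothetical
# and carry no legal meaning; thresholds are as set out in the statement above.
from dataclasses import dataclass

@dataclass
class Service:
    monthly_active_uk_users: int   # average monthly active United Kingdom users
    user_to_user: bool             # has a regulated user-to-user part
    search_engine: bool            # is, or includes, a search engine
    vertical_search: bool          # only searches selected sites or databases
    content_recommender: bool      # uses a content recommender system
    forward_or_share: bool         # lets users forward or share user-generated content
    direct_messaging: bool         # lets users send direct messages to other users

def threshold_conditions_met(s: Service) -> set[str]:
    """Return which categories' threshold conditions the service meets."""
    met: set[str] = set()
    if s.user_to_user:
        # Category 1: either limb of the conditions
        if (s.monthly_active_uk_users > 34_000_000 and s.content_recommender) or \
           (s.monthly_active_uk_users > 7_000_000 and s.content_recommender
            and s.forward_or_share):
            met.add("category 1")
        # Category 2B: smaller user-to-user services with direct messaging
        if s.monthly_active_uk_users > 3_000_000 and s.direct_messaging:
            met.add("category 2B")
    # Category 2A: large search engines that are not vertical search engines
    if s.search_engine and not s.vertical_search \
            and s.monthly_active_uk_users > 7_000_000:
        met.add("category 2A")
    return met
```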
[HCWS312]