Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 Debate

Department: Department for Science, Innovation & Technology

Ben Obese-Jecty Excerpts
Tuesday 4th February 2025

General Committees
Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Sir Christopher. The Online Safety Act will be one of the lasting accomplishments of the last Government. It is world-leading legislation that places significant new responsibilities and duties on social media platforms and search services, to increase safety online. Most importantly, this vital legislation ensures that children are better protected online.

If it is worrying that children aged eight to 17 spend between two and five hours online per day, it is deeply concerning that half of 13-year-olds reported seeing hardcore, misogynistic pornographic material on social media sites. It is for those reasons that Conservative Ministers ensured that the Online Safety Act contained the strongest measures to protect children. For example, platforms will be required to prevent children from accessing harmful and age-inappropriate content and to provide parents and children with clear and accessible ways to report problems online when they arise.

Furthermore, the Act requires all in-scope services that allow pornography to use highly effective age assurance to prevent children from accessing it, including both services that host user-generated content and services that publish pornography. Ofcom has robust enforcement powers available to use against companies that fail to fulfil their duties. The Act also includes provisions to protect adult users, as it ensures that major platforms are more transparent about what kinds of potentially harmful content they allow, and it gives users more control over the types of content they want to see.

The Act allocates regulated services into different categories to ensure that regulatory requirements are applied proportionately. The thresholds that we are debating follow Ofcom’s work and consultation on which platforms should fall into category 1, category 2A and category 2B. The highest-risk platforms—the largest social media and pornography sites—will be designated as category 1 and will bear the highest duty of care. Category 2A will contain the highest-risk search engines, such as Google and Bing, and category 2B will contain the remaining high-risk and high-reach sites.

The regulations enable Ofcom to designate services subject to additional duties. That will address content that promotes, encourages or provides instructions for suicide, self-harm or eating disorders, as well as content that is abusive or incites hate. Where users are likely to access this content, category 1 providers will be required to proactively offer adults optional features that reduce the likelihood of their encountering such content or that alert them to its nature. There are concerns that the category 1 threshold may omit smaller platforms that host harmful content, and it may be prudent for the Government to revisit that definition at a later date.

The Online Safety Act’s impact assessment concludes that more than 25,000 companies may be within scope of the new regulatory framework. Companies designated into the higher categories will take on additional duties and, with them, additional compliance risk. Can the Minister reassure tech companies, especially small and medium-sized businesses, that her Department will continue to work with them to ensure that the cost of compliance is affordable and proportionate?

I note that Ofcom expects the illegal harms safety duties to become enforceable around March 2025, once technology companies have assessed the risk of online harms on their platforms. Does the Minister agree that platforms do not need to wait, and should already be taking action to improve safety on their sites? Can the Minister confirm that she is encouraging platforms to take this proactive action?

Separately from the Online Safety Act, the last Government launched the pornography review to explore the effectiveness of regulation, legislation and the law enforcement response to pornography. I understand that that review has now concluded. Can the Minister reassure us that the review’s final report will be published imminently?

I would be grateful for the Minister’s comments on these points. The Online Safety Act is a pivotal piece of legislation and makes the UK the safest place in the world to be a child online. I am proud of the previous Government’s role in passing it, and I urge the Minister to ensure that it is fully implemented as soon as possible.