Online Safety Act: Implementation

Wednesday 26th February 2025

Westminster Hall

Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this timely debate. His wealth of knowledge on this topic is clear, and his voice in pursuing the most effective iteration of the legislation has been constant.

The previous Government passed the world-leading Online Safety Act, which places significant new responsibilities and duties on social media platforms and search services to increase child safety online—aims that all Members can agree upon. Platforms will be required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they arise.

The evidence base showing that social media is adversely impacting our children’s mental health is growing stronger. The Royal Society for Public Health says that about 70% of young people now report that social media increases their feelings of anxiety and depression. It is for those reasons that Conservative Ministers ensured the strongest measures in the Act to protect children.

The Act places duties on online platforms to protect children’s safety and put in place measures to mitigate risks. They will also need to proactively tackle the most harmful illegal content and activity. Once in force, the Act will create a new regulatory regime to significantly improve internet safety, particularly for young people. It will address the rise in harmful content online and will give Ofcom new powers to fulfil the role of the independent regulator. Fundamentally, it will ensure services take responsibility for making their products safe for their users.

I note that the Government have said that they are prioritising work with Ofcom to get the Act implemented swiftly and effectively to deliver a safer online world, but I recognise the concerns of parents and campaigners who worry that children will continue to be exposed to harmful and age-inappropriate content every day until these regulations come into force. Will the Minister acknowledge those concerns in her remarks?

The Act places new duties on certain internet services to protect users from illegal content on their platforms. The purpose of those illegal content duties is to require providers of user-to-user and search services to take more responsibility for protecting UK-based users from illegal content and activity that is facilitated or encountered via their services.

In December, Ofcom published its finalised illegal harms codes of practice and risk assessment guidance. The codes of practice describe the measures that services can take to fulfil their illegal content duties, and they recommend that providers of different kinds and with different capacities take different steps proportionate to their size, capacity and level of risk.

The codes recommend measures in areas including user support, safety by design, additional protections for children and content moderation or de-indexing. Many of the measures in the draft codes are cross-cutting and will help to address all illegal harms. Certain measures are targeted at specific high-priority harms, including child sexual abuse material, terrorism and fraud. Those include measures on automated tools to detect child sexual abuse material, and routes for the police and the Financial Conduct Authority to report fraud and scams to online service providers. The measures will also make it easier for users to report potentially illegal content.

Ofcom has also published guidance on how providers should carry out risk assessments for illegal content and activity. Providers now have three months to complete their illegal content risk assessment. Can the Minister update the House on whether the completion of the risk assessments will coincide with the codes of practice coming into force?

Another important milestone was the publication of Ofcom’s children’s access assessment guidance last month. Services will have to assess whether their service is likely to be accessed by children and, once the protection of children codes have been finalised by the summer, must put in place the appropriate protections, known as age assurance duties.

All services that allow pornography must implement highly effective age assurance by July at the latest, to ensure that children are not normally able to access pornographic content. Together, the illegal harms and child safety codes should put in place an important foundation for the protection of users. For example, children will be better protected online, with services having to introduce robust age checks to prevent children from seeing content such as suicide and self-harm material and pornography, and having to tackle harmful algorithms. Illegal content, including hate speech, terrorist content and content that encourages or facilitates suicide, should be taken down as soon as services are aware of it. Women and girls will be better protected from misogyny, harassment and abuse online.

The Government have said they are keen for Ofcom to use its enforcement powers as the requirements on services come into effect to make sure that the protections promised by the Act are delivered for users. Samaritans has called on the Government and Ofcom to

“fully harness the power of the Online Safety Act to ensure people are protected from dangerous content”.

Will the Minister confirm that the Government will fully back Ofcom in its enforcement of the illegal harms and child safety codes?

There are concerns that Ofcom appears to be relying on future iterations of the codes to bring in the more robust requirements that would improve safety. Relying on revision of the codes to bring them up to the required standard will likely be a slow process. The requirement to implement the initial codes and guidance is significant and is unlikely to allow capacity for revision. Furthermore, the Secretary of State’s ability to stipulate such revisions could further delay that process. To that end, it is essential that the first iteration of the codes of practice is robust enough to endure without the need for revision in the short term. Although that might be difficult to achieve in an environment that moves as quickly as the digital space, it must be striven for, lest we end up with legislation that does not hold online platforms to account and does not protect victims of online harms as it should.

As legislators, we have a responsibility to ensure that the online world is a safe place for our children. We also have a responsibility to ensure that online platforms take their obligations seriously. I am pleased that the previous Government’s Online Safety Act delivers on both those points. I urge the Minister to ensure that it is fully implemented as soon as possible.

Graham Stringer (in the Chair)

We have gained a considerable amount of time because of disciplined interventions and short speeches. I ask the Minister to ensure that there is a small amount of time at the end for the Member in charge to wind up.