Online Safety Act: Implementation

Wednesday 26th February 2025


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Sir Jeremy Wright

The hon. Gentleman identifies a real risk in this space: we are always playing catch-up, and so are the regulators. That is why we have tried—perhaps not entirely successfully—to design legislation that gives the regulators the capacity to move faster, but we have to ask them to do so and they have to take responsibility for that. I am raising these points because I am concerned that this particular regulator in this particular set of circumstances is not being as fleet of foot as it could be, but the hon. Gentleman is right that this is a concern across the regulatory piece. I would also say that regulators are not the only actor. We might expect the Government to pick up this issue and ensure that regulators do what Parliament expects, but in this area the signs are not encouraging.

As some Members in Westminster Hall this morning know because they were present during the debates on it, elsewhere in the Online Safety Act there is provision to bring forward secondary legislation to determine how online services are categorised, with category 1 services being subject to additional duties and expectations. That process was discussed extensively during the passage of the Act, and an amendment was made to it in the other place to ensure that smaller platforms with high incidences of harmful content could be included in category 1, along with larger platforms. That is an important change, because some of the harm that we are most concerned about may appear on smaller specialist platforms, or may go there to hide from the regulation of larger platforms. The previous Government accepted that amendment in this House, and the current Government actively supported it in opposition.

I am afraid, however, that Ofcom has now advised the Government to disregard that change, and the Government accepted that advice and brought a statutory instrument to Committee on 4 February that blatantly contravenes the will of Parliament and the content of primary legislation. It was a clear test case of the Government’s willingness to defend the ambition of the Online Safety Act, and I am afraid they showed no willingness to do so.

If we cannot rely on the Government to protect the extent of the Act—perhaps we should not, because regulatory independence from the Executive is important—who should do it? I am sure the Minister will say in due course that it falls within the remit of the Science, Innovation and Technology Committee. I mean no disrespect to that Committee, but it has a lot on its plate already and supervision of the fast-moving world of online safety regulation is a big job in itself. It is not, by the way, the only such job that needs doing. We have passed, or are in the process of passing, several other pieces of similar framework legislation in this area, including the Digital Markets, Competition and Consumers Act 2024, the Data (Use and Access) Bill and the Media Act 2024, all of which focus on regulators’ power to act and on the Secretary of State’s power to direct them. Parliament should have the means to oversee how that legislation is being implemented too.

Many of these areas overlap, of course, as regulators have recognised. They established the Digital Regulation Co-operation Forum to deal with the existing need to collaborate, which of course is only likely to grow with the pervasive development of artificial intelligence. Surely we should think about parliamentary oversight along the same lines. That is why I am not the first, nor the only, parliamentarian to be in favour of a new parliamentary Committee—preferably a Joint Committee, so that the expertise of many in the other place can be utilised—to scrutinise digital legislation. The Government have set their face against that idea so far, but I hope they will reconsider.

My final point is that there is urgency. The children’s safety codes will be finalised within weeks, and will set the tone for how ambitious and innovative—or otherwise—online services will be in keeping our children safe online. We should want the highest possible ambition, not a reinforcement of the status quo. Ofcom will say, and has said, that it can always do more in future iterations of the codes, but realistically the first version will stand for years before it is revised, and there will be many missed opportunities to make a child’s online world safer in that time. It is even less likely that new primary legislation will come along to plug any gaps anytime soon.

As the responsible Secretary of State, I signed off the online harms White Paper in 2019. Here we are in 2025, and the Online Safety Act is still not yet fully in force. We must do the most we can with the legislation we have, and I fear that we are not.

Given the efforts that were made all across the House and well beyond it to deliver the best possible set of legislative powers in this vital area, timidity and lack of ambition on the part of Ministers or regulators—leading to a pulling back from the borders of this Act—is not just a challenge to parliamentary sovereignty but, much more importantly, a dereliction of duty to the vulnerable members of our society, whose online safety is our collective responsibility. There is still time to be braver and ensure that the Online Safety Act fulfils its potential. That is what Ofcom and the Government need to do.

Graham Stringer (in the Chair)

I remind hon. and right hon. Members to bob if they wish to speak. I intend to call the Front-Bench spokespeople at half-past 10, so I will impose a four-minute limit on speeches. That gives very little scope for interventions; it is up to hon. Members whether to take them, but I may have to reduce the time limit.

--- Later in debate ---
Ben Obese-Jecty (Huntingdon) (Con)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this timely debate. His wealth of knowledge on this topic is clear, and his voice in pursuing the most effective iteration of the legislation has been constant.

The previous Government passed the world-leading Online Safety Act, which places significant new responsibilities and duties on social media platforms and search services to increase child safety online—aims that all Members can agree upon. Platforms will be required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they arise.

The evidence base showing that social media is adversely impacting our children’s mental health is growing stronger. The Royal Society for Public Health says that about 70% of young people now report that social media increases their feelings of anxiety and depression. It is for those reasons that Conservative Ministers ensured the strongest measures in the Act to protect children.

The Act places duties on online platforms to protect children’s safety and put in place measures to mitigate risks. They will also need to proactively tackle the most harmful illegal content and activity. Once in force, the Act will create a new regulatory regime to significantly improve internet safety, particularly for young people. It will address the rise in harmful content online and will give Ofcom new powers to fulfil the role of the independent regulator. Fundamentally, it will ensure services take responsibility for making their products safe for their users.

I note that the Government have said that they are prioritising work with Ofcom to get the Act implemented swiftly and effectively to deliver a safer online world, but I recognise the concerns of parents and campaigners who worry that children will continue to be exposed to harmful and age-inappropriate content every day until these regulations come into force. Will the Minister acknowledge those concerns in her remarks?

The Act places new duties on certain internet services to protect users from illegal content on their platforms. The purpose of those illegal content duties is to require providers of user-to-user and search services to take more responsibility for protecting UK-based users from illegal content and activity that is facilitated or encountered via their services.

In December, Ofcom published its finalised illegal harms codes of practice and risk assessment guidance. The codes of practice describe the measures that services can take to fulfil their illegal content duties, and they recommend that providers of different kinds and with different capacities take different steps proportionate to their size, capacity and level of risk.

The codes recommend measures in areas including user support, safety by design, additional protections for children and content moderation or de-indexing. Many of the measures in the draft codes are cross-cutting and will help to address all illegal harms. Certain measures are targeted at specific high-priority harms, including child sexual abuse material, terrorism and fraud. Those include measures on automated tools to detect child sexual abuse material and for establishing routes so that the police and the Financial Conduct Authority can report fraud and scams to online service providers. The included measures will also make it easier for users to report potentially illegal content.

Ofcom has also published guidance on how providers should carry out risk assessments for illegal content and activity. Providers now have three months to complete their illegal content risk assessment. Can the Minister update the House on whether the completion of the risk assessments will coincide with the codes of practice coming into force?

Another important milestone was the publication of Ofcom’s children’s access assessment guidance last month. Services will have to assess whether their service is likely to be accessed by children and, once the protection of children codes have been finalised by the summer, must put in place the appropriate protections, known as age assurance duties.

All services that allow pornography must implement, by July at the latest, highly effective age assurance to ensure that children are not normally able to access pornographic content. Together, the illegal harms and child safety codes should put in place an important foundation for the protection of users. For example, children will be better protected online, with services having to introduce robust age checks to prevent children from seeing content such as suicide and self-harm material and pornography, and having to tackle harmful algorithms. Illegal content, including hate speech, terrorist content and content that encourages or facilitates suicide, should be taken down as soon as services are aware of it. Women and girls will be better protected from misogyny, harassment and abuse online.

The Government have said they are keen for Ofcom to use its enforcement powers as the requirements on services come into effect to make sure that the protections promised by the Act are delivered for users. Samaritans has called on the Government and Ofcom to

“fully harness the power of the Online Safety Act to ensure people are protected from dangerous content”.

Will the Minister confirm that the Government will fully back Ofcom in its enforcement of the illegal harms and child safety codes?

There are concerns that Ofcom appears to be relying on future iterations of the codes to bring in the more robust requirements that would improve safety. Relying on revision of the codes to bring them up to the required standard will likely be a slow process. The requirement to implement initial codes and guidance is significant and is unlikely to allow capacity for revision. Furthermore, the Secretary of State’s ability to stipulate such revisions could hamper that. To that end, it is essential that the first iteration of the codes of practice is robust enough to endure without the need for revision in the short term. Although that might be difficult to achieve in an environment that moves as quickly as the digital space, it must be strived for, lest we end up with legislation that does not hold online platforms to account and does not protect victims of online harms as it should.

As legislators, we have a responsibility to ensure that the online world is a safe place for our children. We also have a responsibility to ensure that online platforms take their obligations seriously. I am pleased that the previous Government’s Online Safety Act delivers on both those points. I urge the Minister to ensure that it is fully implemented as soon as possible.

Graham Stringer (in the Chair)

We have gained a considerable amount of time because of disciplined interventions and short speeches. I ask the Minister to ensure that there is a small amount of time at the end for the Member in charge to wind up.