Online Safety Act: Implementation Debate
Monica Harding (Liberal Democrat - Esher and Walton)
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for organising this important debate and for his continued work scrutinising this legislation.
The Online Safety Act was a landmark step towards making the internet a safer place, particularly for our children, but its implementation has fallen far short of what Parliament intended, hampered by Ofcom’s slow pace and limited ambition. The Act was designed to ensure that tech companies take responsibility for protecting users, especially children, from harmful content, but the current approach taken by Ofcom undermines that intent in several ways.
We have waited more than a year for Ofcom to complete its consultation on the illegal content codes of practice, but those codes fail to enforce a robust safety-by-design approach. Instead of proactively mitigating risks, many of its measures focus only on responding to harm after it has already occurred, and the children’s safety codes, which are still in draft, appear to follow a similarly disappointing trajectory. Features such as livestreaming, ephemeral content and recommender algorithms—tools that are frequently exploited for the purpose of online abuse—are also not meaningfully addressed in the current framework.
The Act also has a significant shortcoming: it allows companies to be deemed compliant simply by following Ofcom’s codes, regardless of whether their platforms remain unsafe in reality. This means that tech giants are permitted to hide behind a regulatory shield rather than being forced to address known risks on their platforms; all the while, children continue to be exposed to harm. The Act also explicitly requires protections tailored to different age groups, but in implementing it, Ofcom treats a seven-year-old and a 17-year-old as if their online safety needs are identical. In doing so, it has fundamentally failed to recognise how children’s development affects their online experiences and their vulnerabilities.
Action on fake and anonymous accounts has been slow and weak. This was a huge area of focus for parliamentarians before the Act was passed, and Ofcom itself identified it as a major risk factor in crimes such as terrorism, child sexual exploitation, harassment and fraud. As we approach 18 months since the passage of the Act, there has been no change for UK users. Instead of prioritising verification measures, Ofcom has pushed them to a later phase of implementation, delaying real action until at least 2027. That is unacceptable, especially when Ofcom’s own research shows that over 60% of eight- to 11-year-olds are active on social media, despite existing age restrictions prohibiting it.
The Government’s and Ofcom’s delays in introducing user identity verification measures are unacceptable. The harms associated with fake and anonymous accounts are deeply personal and painfully real, with millions of Britons suffering from online abuse, scams and harassment each year. I hope the Minister can provide a robust explanation for the timidity and delay, and rule out any suggestion that the delays were a result of lobbying pressures from platforms. The best assurance she could give today would be a commitment that the introduction of verification measures will be brought forward to 2026, so that UK internet users are better protected.
In short, I ask the Minister to recognise the urgency of taking the following action. Ofcom must revise its codes to require proactive risk mitigation; tech companies should not be allowed to claim compliance with the regulatory framework while continuing to expose users to harm; platforms must be held accountable if they fail to meet real safety standards; and protections need to be specific to different age groups, so that younger children and teenagers receive appropriate levels of safety and access.
I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.
I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and I hope he can secure the Committee that he has proposed.
Can the Minister explain what she meant when she said that Ofcom had to ensure that the codes were as judicial review-proofed as possible? Surely Ofcom’s approach should be to ensure that the codes protect vulnerable users, rather than be judicial review-proofed.
The point I was trying to make was that Ofcom is spending time ensuring that it gets the codes right and can implement them as soon as possible, without being delayed by any potential challenge. To avoid any challenge, it must ensure that it gets the codes right.