Online Safety Act: Implementation Debate
Jeremy Wright (Conservative - Kenilworth and Southam)
I beg to move,
That this House has considered the implementation of the Online Safety Act 2023.
It is a great pleasure to serve under your chairmanship, Mr Stringer, and I am grateful for the opportunity to open the debate. Let me start with some positives. The Online Safety Act 2023 is certainly not the last word on the subject, but it is, in my view, a big step forward in online safety, providing a variety of tools that allow the regulator to make the online world safer, particularly for children. I remain of the view that Ofcom is the right regulator for the task, not least because it can start its work sooner as an existing regulator and given the overlap with its existing work—for example, on video-sharing platforms. I also have great regard for the diligence and expertise of many at Ofcom who are now charged with these new responsibilities. However, I am concerned that Ofcom appears unwilling to use all the tools that the Act gives it to make the online world a safer place, and I am concerned that the Government appear unwilling to press Ofcom to be more ambitious. I want to explain why I am concerned, why I think it matters and what can be done about it.
Let me start with what I am worried about. There was a great deal of consensus about the passing of the Online Safety Act, and all of us involved in its development recognised both the urgent need to act on online harms and the enormity of the task. That means that the eventual version of the Act does not cover everything that is bad online and, of necessity, sets up a framework within which the regulator is required to fill in the gaps and has considerable latitude in doing so.
The architecture of that framework is important. Because we recognised that emerging harms would be more clearly and quickly seen by online services themselves than by legislators or regulators, in broad terms the Act requires online services to properly assess the risk of harms arising on their service and then to mitigate those risks. My concern is that Ofcom has taken an unnecessarily restrictive view of the harms it is asking services to assess and act on and, indeed, a view that is inconsistent with the terms of the Act. Specifically, my conversations with Ofcom suggest to me that it believes the Act only gives it power to act on harms that arise from the viewing of individual pieces of bad content. I do not agree, and let me explain why.
With limited exceptions, if an online service has not identified a risk in its risk assessment, it does not have to take action to reduce or eliminate that risk, so which risks are identified in the risk assessment really matters. That is why the Act sets out how a service should go about its risk assessment and what it should look out for. For services that may be accessed by children, the relevant risk assessment duties are set out in section 11 of the Act. Section 11(6) lists the matters that should be taken into account in a children’s risk assessment. Some of those undoubtedly refer to content, but some do not. Section 11(6)(e), for example, refers to
“the extent to which the design of the service, in particular its functionalities”
affects the risk of adults searching for and contacting children online. That is not a risk related to individual bits of content.
It is worth looking at section 11(6)(f), which, if colleagues will indulge me, I want to quote in full. It says that a risk assessment should include
“the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically), and the impact of such use on the level of risk of harm that might be suffered by children”.
I think that that paragraph is talking about harms well beyond individual pieces of bad content. It is talking about damaging behaviours deliberately instigated by the design and operation of the online service, and the way its algorithms are designed to make us interact with it. That is a problem not just with excessive screen time, on which Ofcom has been conspicuously reluctant to engage, but with the issue of children being led from innocent material to darker and darker corners of the internet. We know that that is what happened to several of the young people whose suicides have been connected to their online activity. Algorithms designed to keep the user on the service for longer make that risk greater, and Ofcom seems reluctant to act on them despite the Act giving it powers to do so. We can see that from the draft code of practice on harm to children, which Ofcom published at the end of last year.
This debate is timely because the final version of the code of practice is due in the next couple of months. If Ofcom is to change course and broaden its characterisation of the risks that online services must act on—as I believe it should—now is the time. Many of the children’s welfare organisations that we all worked with so closely to deliver the Act in the first place are saying the same.
If Ofcom’s view of the harms to children on which services should act falls short of what the Act covers, why does it matter? Again, the answer lies in the architecture of the Act. The codes of practice that Ofcom drafts set out actions that services could take to meet their online safety duties. If services do the things that the codes set out, they are taken to have met the relevant safety duty and are safe from regulatory penalty. If in the code of practice Ofcom asks services to act only on content harms, it is highly likely that that is all services will do, because it is compliance with the code that provides regulatory immunity. If it is not in the code, services probably will not do it. Codes that ignore some of the Act’s provisions to improve children’s safety mean that the online services children use will ignore those provisions, too. We should all be worried about that.
That brings me to the second area where I believe that Ofcom has misinterpreted the Act. Throughout the passage of the Act, Parliament accepted that the demands that we make of online services to improve the safety of their users would have to be reasonable, not least to balance the risks of online activity with its benefits. In later iterations of the legislation, that balance is represented by the concept of proportionality in the measures that the regulator could require services to take. Again, Ofcom has been given much latitude to interpret proportionality. I am afraid that I do not believe it has done so consistently with Parliament’s intention. Ofcom’s view appears to be that for a measure to be proportionate there must be a substantial amount of evidence to demonstrate its effectiveness. That is not my reading of it.
Section 12 of the Act sets out the obligation on services to take proportionate measures to mitigate and manage risks to children. Section 13(1) offers more on what proportionate means in that context. It states:
“In determining what is proportionate for the purposes of section 12, the following factors, in particular, are relevant—
(a) all the findings of the most recent children’s risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to children), and
(b) the size and capacity of the provider of a service.”
In other words, a measure that would be ruinously expensive or disruptive, especially for a smaller service, and which would deliver only a marginal safety benefit, should not be mandated, but a measure that brings a considerable safety improvement in responding to an identified risk, even if expensive, might well be justified.
Similarly, when it comes to measures recommended in a code of practice, schedule 4(2)(b) states that those measures must be
“sufficiently clear, and at a sufficiently detailed level, that providers understand what those measures entail in practice”,
and schedule 4(2)(c) states that recommended measures must be “proportionate and technically feasible”, based on the size and capacity of the service. We should not ask services to do anything they cannot do, and it should be clear what they have to do to comply. That is what the Act says proportionality means. I cannot find support in the Act for the idea that we must know a measure will work before we try it in order for it to be proportionate and therefore recommended in a code of practice. Why does that disagreement on interpretation matter? Because we should want online platforms and services to be innovative in how they fulfil their safety objectives, especially in the fast-moving landscape of online harms. I fear that Ofcom’s interpretation of proportionality, as requiring evidence of effectiveness, will achieve the opposite.
There will only be an evidence base on effectiveness for a measure that is already being taken somewhere, and that has been taken for long enough to generate that evidence of effectiveness. If we limit recommended actions to those that have evidence of success, we effectively set the bar for safety measures at current best practice. Given the safe harbour offered by measures recommended in codes of practice, that could mean services being deterred from innovating, because they get the protection only by doing things that are already being done.
I thank the right hon. and learned Gentleman for securing this incredibly important debate. He has described in his very good speech how inconsistency can occur across different platforms and providers. As a parent of a 14-year-old daughter who uses multiple apps and platforms, I want confidence about how they are regulated and that the security measures to keep her safe are consistent across all platforms she might access. My responsibility as a parent is to match that. The right hon. and learned Gentleman rightly highlights how Ofcom’s interpretation of the Act has led to inconsistencies and potential grey areas for bad faith actors to exploit, which will ultimately damage our children.
The hon. Gentleman makes an interesting point. We have to balance two things, though. We want consistency, as he suggests, but we also want platforms to respond to the circumstances of their own service, and to push the boundaries of what they can achieve by way of safety measures. As I said, they are in a better position to do so than legislators or regulators are to instruct them. The Act was always intended to put the onus on the platforms to take responsibility for their own safety measures. Given the variety of actors and different services in this space, we are probably not going to get a uniform approach, nor should we want one. The hon. Gentleman is right to say that the regulator needs to ensure that its expectations of everyone are high. There is a further risk: not just that we fix the bar at the status quo but that, for all the opportunity platforms have to innovate, some might go backwards on new safety measures they are already implementing, because those measures are not recommended or encouraged by Ofcom’s code of practice. That cannot be what we want to happen.
Those are two areas where I believe Ofcom’s interpretation of the Act is wrong and retreats in significant ways from Parliament’s intention to give the regulator power to act to enhance children’s online safety. I also believe it matters that it is wrong. The next question is what should be done about it. I accept that sometimes, as legislators, we have no choice but to pass framework legislation, with much of the detail on implementation to come later. That may be because the subject is incredibly complex, or because the subject is fast-moving. In the case of online safety, it is both.
Framework legislation raises serious questions about how Parliament ensures its intentions are followed through in all the subsequent work on implementation. What do we do if we have empowered regulators to act but their actions do not fulfil the expectations that we set out in legislation?
Does the right hon. and learned Gentleman agree that this is not only about Ofcom but regulators more widely, and their ability to be agile? Does he believe them to be more risk-averse in areas such as digital technology, relying on traditional consultation time periods, when the technology is moving way faster?
The hon. Gentleman identifies a real risk in this space: we are always playing catch-up, and so are the regulators. That is why we have tried—perhaps not entirely successfully—to design legislation that gives the regulators the capacity to move faster, but we have to ask them to do so and they have to take responsibility for that. I am raising these points because I am concerned that this particular regulator in this particular set of circumstances is not being as fleet of foot as it could be, but the hon. Gentleman is right that this is a concern across the regulatory piece. I would also say that regulators are not the only actor. We might expect the Government to pick up this issue and ensure that regulators do what Parliament expects, but in this area the signs are not encouraging.
As some Members in Westminster Hall this morning know because they were present during the debates on it, elsewhere in the Online Safety Act there is provision to bring forward secondary legislation to determine how online services are categorised, with category 1 services being subject to additional duties and expectations. That process was discussed extensively during the passage of the Act, and an amendment was made to it in the other place to ensure that smaller platforms with high incidences of harmful content could be included in category 1, along with larger platforms. That is an important change, because some of the harm that we are most concerned about may appear on smaller specialist platforms, or may go there to hide from the regulation of larger platforms. The previous Government accepted that amendment in this House, and the current Government actively supported it in opposition.
I am afraid, however, that Ofcom has now advised the Government to disregard that change, and the Government accepted that advice and brought a statutory instrument to Committee on 4 February that blatantly contravenes the will of Parliament and the content of primary legislation. It was a clear test case of the Government’s willingness to defend the ambition of the Online Safety Act, and I am afraid they showed no willingness to do so.
If we cannot rely on the Government to protect the extent of the Act—perhaps we should not, because regulatory independence from the Executive is important—who should do it? I am sure the Minister will say in due course that it falls within the remit of the Science, Innovation and Technology Committee. I mean no disrespect to that Committee, but it has a lot on its plate already and supervision of the fast-moving world of online safety regulation is a big job in itself. It is not, by the way, the only such job that needs doing. We have passed, or are in the process of passing, several other pieces of similar framework legislation in this area, including the Digital Markets, Competition and Consumers Act 2024, the Data (Use and Access) Bill and the Media Act 2024, all of which focus on regulators’ power to act and on the Secretary of State’s power to direct them. Parliament should have the means to oversee how that legislation is being implemented too.
Many of these areas overlap, of course, as regulators have recognised. They established the Digital Regulation Co-operation Forum to deal with the existing need to collaborate, which of course is only likely to grow with the pervasive development of artificial intelligence. Surely we should think about parliamentary oversight along the same lines. That is why I am not the first, nor the only, parliamentarian to be in favour of a new parliamentary Committee—preferably a Joint Committee, so that the expertise of many in the other place can be utilised—to scrutinise digital legislation. The Government have set their face against that idea so far, but I hope they will reconsider.
My final point is that there is urgency. The children’s safety codes will be finalised within weeks, and will set the tone for how ambitious and innovative—or otherwise—online services will be in keeping our children safe online. We should want the highest possible ambition, not a reinforcement of the status quo. Ofcom will say, and has said, that it can always do more in future iterations of the codes, but realistically the first version will stand for years before it is revised, and there will be many missed opportunities to make a child’s online world safer in that time. It is even less likely that new primary legislation will come along to plug any gaps anytime soon.
As the responsible Secretary of State, I signed off the online harms White Paper in 2019. Here we are in 2025, and the Online Safety Act is still not fully in force. We must do the most we can with the legislation we have, and I fear that we are not doing so.
Given the efforts that were made all across the House and well beyond it to deliver the best possible set of legislative powers in this vital area, timidity and lack of ambition on the part of Ministers or regulators—leading to a pulling back from the borders of this Act—is not just a challenge to parliamentary sovereignty but, much more importantly, a dereliction of duty to the vulnerable members of our society, whose online safety is our collective responsibility. There is still time to be braver and ensure that the Online Safety Act fulfils its potential. That is what Ofcom and the Government need to do.
I remind hon. and right hon. Members to bob if they wish to speak. I intend to call the Front-Bench spokespeople at half-past 10, so I will impose a four-minute limit on speeches. That gives very little scope for interventions, though it is up to hon. Members whether to take them, but I may have to reduce the time limit.
Ofcom has had to spend a long time consulting on the codes to ensure that they are as well proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that, in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect that Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportionate measures are those that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.
I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.
We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.
Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores in children’s access to harmful content on the apps of regulated services. The report is due between January ’26 and January ’27. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.
We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.
I am grateful to everyone who has spoken in the debate. We have talked about the consensus there was in the passage of the Online Safety Bill. I think it is fair to say that that consensus is broadly still present, based on what Members have said this morning, and I am grateful for it.
There is a need to get this Act implemented. I accept what the Minister says about that, and others have made the same point: we do not want to make the best the enemy of the good, and there is always a trade-off between, on the one hand, getting the particular mechanisms that we know will protect people online in place as swiftly as possible, and on the other hand, making them as extensive and effective as possible.
However, given how long it takes for Parliament to make change—I make no apologies for repeating this point—we need to make the best use of the legislation that we have. I have not made a case this morning for extending the parameters of the legislation; I have made a case for using the parameters we already have, which Parliament has already legislated into being and which we have passed over to the regulator for it to use.
I accept that regulation and legislation are not passed for effect; we pass them so that they can work. We do so not to make ourselves feel better, but to make the lives of our constituents better, so the Minister is right to say that the usability of all this should be at the heart of what we are interested in. I accept the point made by the hon. Member for Esher and Walton (Monica Harding) that Ofcom should not be predominantly focused on insulating itself from judicial review. As a former Law Officer, I think that is an impossible task anyway. This legislation and the regulation that follows it will be challenged—the online platforms have every incentive to challenge it. We cannot be so terrified of that prospect that we are unwilling to extend the parameters of the regulation as far as we believe they should go. That is why I think everybody needs to be a tad braver in all this.
Finally, I simply want to repeat the point that many of us have made, which is that we in Parliament need a way of keeping our eye on what is happening in this space. These debates are great, but shouting at Ofcom through the loudhailer of Westminster Hall is not as effective as a Committee set up to do this in a more structured and, frankly, more productive and consensual way. That is the gap that exists in the landscape of parliamentary oversight, and as we develop more and more digital regulation, as we have to, and as AI advances, we will have to fill that gap. I simply say to the Government that filling it sooner rather than later would be wise.
Question put and agreed to.
Resolved,
That this House has considered the implementation of the Online Safety Act 2023.