Online Safety Act: Implementation Debate
Feryal Clark (Labour - Enfield North), Department for Science, Innovation & Technology
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall. Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not form part of the official record.
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this debate on the implementation of the Online Safety Act. I know that he has been following the Bill throughout its passage and has been a critic of every Minister, even his Government’s Ministers, whenever the Bill was watered down or delayed, so I expect him to hold all of us to account. I am grateful to him and all the hon. Members who have spoken this morning. The Government share their commitment to keeping users safe online. It is crucial that we continue to have conversations about how best to achieve that goal.
The Online Safety Act lays the foundations for strong protections against illegal content and harmful material online. It addresses the complex nature of online harm, recognising that harm is not limited to explicit content and extends to the design and functionality of online services. We know that the legislation is not perfect. I hear that at every such debate, but we are committed to supporting Ofcom to ensure that the Act is implemented quickly, as this is the fastest way to protect people online.
2025 is the year of action for online safety, and the Government have already taken a number of steps to build on Ofcom’s implementation of the Act. In November last year, the Secretary of State published the draft “Statement of Strategic Priorities for online safety”. That statement is designed to deliver a comprehensive, forward-looking set of online safety priorities for the full term of this Government. It will give Ofcom the backing to be bold in specific areas, such as embedding safety by design, through considering all aspects of a service’s business model, including functionalities and algorithms.
We are also working to build further on the evidence base to inform our next steps on online safety, and I know that this issue was debated earlier this week. In December, we announced a feasibility study to understand the impact of smartphones and social media on children, and in the Data (Use and Access) Bill, we have included provisions to allow the Secretary of State to create a new researcher access regime for online safety data. That regime will address a systemic issue that has historically prevented researchers from understanding how platforms operate, and it will help to identify and mitigate new and preventable harms. We have also made updates to the framework, such as strengthening measures to tackle intimate image abuse under the Online Safety Act, and we are following up on our manifesto commitment to hold perpetrators to account for the creation of explicit, non-consensual deepfake images through amendments to the Data Bill.
We are also building on the measures in the Online Safety Act that allow Ofcom to take information on behalf of coroners. Through the Data Bill, we are bringing in additional powers to allow coroners to request Ofcom to issue a notice requiring platforms to preserve children’s data, which can be crucial for investigations into a child’s tragic death. My hon. Friend the Member for Darlington (Lola McEvoy) raised Jools’ law, of which I am very aware, and I believe that she is meeting Ministers this week to discuss it further.
Finally, we recently announced that, in the upcoming Crime and Policing Bill, we are introducing multiple offences to tackle AI sexual abuse, including a new offence for possessing, creating or supplying AI tools designed to generate child sexual abuse material.
Members have raised the issue of the Act’s implementation being too slow. We are aware of the frustrations over the amount of time that it has taken to implement the Online Safety Act, not least because of the importance of the issues at hand. We are committed to working with Ofcom to ensure that the Online Safety Act is implemented as quickly and effectively as possible.
On implementation, would the Minister give clarity about the watermark for re-consultation and the point of delay of implementing the children’s codes under the Act? Amendments could be made to the children’s codes and I do not think they would trigger an automatic re-consultation with platforms. Could the Minister elaborate on where the delay would come from and how much scope Parliament has to amend those codes, which will be published in April?
Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places a high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportional measures are those measures that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.
I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.
We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime, two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.
Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores and children’s access to harmful content through the apps of regulated services. The report is due between January ’26 and January ’27. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.
We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.
If the Government fully support our concerns about small but harmful sites, will the statutory instrument be reworked to bring them back into category 1, as the Act states?
The Government are confident that the duties to tackle illegal content and, where relevant, protect children from harmful content will have a meaningful impact on the small but risky services to which the hon. Gentleman refers. Ofcom has created a dedicated supervision taskforce for small but high-risk services, recognising the need for a bespoke approach to securing compliance. The team will focus on high-priority risks, such as CSAM, suicide and hate offences directed at women and girls. Where services do not engage with Ofcom and where there is evidence of non-compliance, Ofcom will move quickly to enforcement action, starting with illegal harm duties from 17 March, so work is being done on that.
The comprehensive illegal content safety duties will apply to all user-to-user services, and child safety duties will apply to all user-to-user services likely to be accessed by children, including the small but high-risk sites. These duties will have the most impact in holding the services to account. Because of the deep concerns about these forums, Ofcom has, as I said, created the small but risky supervision taskforce. For example, Ofcom will be asking an initial set of firms that pose a particular risk, including smaller sites, to disclose their illegal content risk assessments by 31 March.
The Government have been clear that we will act where there is evidence that harm is not being adequately addressed despite the duties being in effect, and we have been clear to Ofcom that it has the Government’s and Parliament’s backing to be bold in the implementation of the Online Safety Act. We are in clear agreement that the Act is not the end of the road, and Ofcom has already committed to iterating on the codes of practice, with the first consultation on further measures being launched this spring. The Government remain open minded as to how we ensure that users are kept safe online, and where we need to act, we will. To do so, we must ensure that the actions we take are carefully considered and rooted in evidence.
Will the consultation this spring for the next iterations of the codes include consultation with parliamentarians, or is it solely with platforms?
I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.
I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and hopefully he can secure the Committee that he has raised.
Can the Minister explain what she meant when she said that Ofcom had to ensure that the codes were as judicial review-proofed as possible? Surely Ofcom’s approach should be to ensure that the codes protect vulnerable users, rather than be judicial review-proofed.
The point I was trying to make was that Ofcom is spending time ensuring that it gets the codes right and can implement them as soon as possible, without being delayed by any potential challenge. To avoid any challenge, it must ensure that it gets the codes right.