Online Safety Act: Implementation Debate
Lola McEvoy (Labour - Darlington)
Westminster Hall
This information is provided by Parallel Parliament and does not comprise part of the official record
It is a pleasure to serve under your chairmanship, Mr Stringer. I pay tribute to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for his exceptional work and for his collegiate approach to this issue. In the interests of time, I will dive straight into the detail of what Ofcom is at risk of failing on in the implementation of its children’s safety codes.
As a trade union organiser, I know more than most about risk assessments and how they can be used in practice to protect people. A static risk assessment, as is required by the Act, will be used to assess the risk at that point in time; there will be a legal requirement to update or check that assessment within a year of its first iteration. A static risk assessment will assess the risk broadly, and if the online platforms adhere to the assessment, they will be in keeping with the legislation and will be given safe harbour, as has already been covered. That is not sufficient for the cohort of people using the platform at this time. The protection of children codes that are being published in April must require the use of a dynamic risk assessment.
Dynamic risk assessment is used by the Ministry of Defence, the NHS and several other work environments where the cohort they work with is vulnerable or at risk of injury or harm, and/or where the staff are at risk of injury from the work they do. Dynamic risk assessments are updated in real time. If the risk cannot be mitigated in real time, the activity must be stopped. I cannot fathom why these assessments are not being incorporated in the first iteration of the children’s codes. They would require the platforms to act in real time when they see children coming to harm, engaging in harmful behaviours or being exposed to harmful content. We know that myriad problems will arise when the codes are implemented. I believe strongly that if a dynamic risk assessment is included for those who say that they have children on their platforms, children will be safer in real time.
This is important not only because a dynamic risk assessment offers enhanced protection, but because it ensures that there is a point person responsible for that work. A point person at the platforms is already included in the Online Safety Act, responsible for being in touch with the Government and Ofcom and for implementing the measures in the Act. A DRA would mean that there was a responsible point person looking in real time to protect children. That is the first point.
I have several other points to make, but only a tiny amount of time. Secondly, it is clear to me that functionalities should be included in the scope of the Act. I have spoken to Ofcom and to the platforms about it. The platforms are already including functionalities in their preliminary risk assessments, so their reading of the Act is that functionalities must be included. If they are going further already, I do not know why Ofcom would not stipulate that they continue to do so. Ofcom’s desire to include a toggle on/off mechanism for some of the functionalities is not sufficient to protect children because, as many of us who have been involved in these debates for a long time know, children will just switch them on. It is not sufficient to have a default-off option either.
I will also touch on Jools’ law. As we have previously discussed in the Chamber, we need an amendment to make sure that in the tragic event of a child’s death, a notice is automatically issued to the regulated online platforms to freeze the child’s accounts, to protect them from deletion and to preserve the data for the families going through an inquest. I pay tribute to the bereaved families who have worked on this. Finally, on timing, we have heard that any changes to the codes will delay implementation. I do not agree with that.
It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this debate on the implementation of the Online Safety Act. I know that he has been following the Bill throughout its passage and has been a critic of every Minister, even his Government’s Ministers, whenever the Bill was watered down or delayed, so I expect him to hold all of us to account. I am grateful to him and all the hon. Members who have spoken this morning. The Government share their commitment to keeping users safe online. It is crucial that we continue to have conversations about how best to achieve that goal.
The Online Safety Act lays the foundations for strong protections against illegal content and harmful material online. It addresses the complex nature of online harm, recognising that harm is not limited to explicit content and extends to the design and functionality of online services. We know that the legislation is not perfect. I hear that at every such debate, but we are committed to supporting Ofcom to ensure that the Act is implemented quickly, as this is the fastest way to protect people online. 2025 is the year of action for online safety, and the Government have already taken a number of steps to build on Ofcom’s implementation of the Act. In November last year, the Secretary of State published the draft “Statement of Strategic Priorities for online safety”. That statement is designed to deliver a comprehensive, forward-looking set of online safety priorities for the full term of this Government. It will give Ofcom the backing to be bold in specific areas, such as embedding safety by design, through considering all aspects of a service’s business model, including functionalities and algorithms.
We are also working to build further on the evidence base to inform our next steps on online safety, and I know that this issue was debated earlier this week. In December, we announced a feasibility study to understand the impact of smartphones and social media on children, and in the Data (Use and Access) Bill, we have included provisions to allow the Secretary of State to create a new researcher access regime for online safety data. That regime is working to fix a systemic issue that has historically prevented researchers from understanding how platforms operate, and it will help to identify and mitigate new and preventable harms. We have also made updates to the framework, such as strengthening measures to tackle intimate image abuse under the Online Safety Act, and we are following up on our manifesto commitment to hold perpetrators to account for the creation of explicit, non-consensual deepfake images through amendments to the Data Bill.
We are also building on the measures in the Online Safety Act that allow Ofcom to take information on behalf of coroners. Through the Data Bill, we are bringing in additional powers to allow coroners to request Ofcom to issue a notice requiring platforms to preserve children’s data, which can be crucial for investigations into a child’s tragic death. My hon. Friend the Member for Darlington (Lola McEvoy) raised Jools’ law, of which I am very aware, and I believe that she is meeting Ministers this week to discuss it further.
Finally, we recently announced that, in the upcoming Crime and Policing Bill, we are introducing multiple offences to tackle AI sexual abuse, including a new offence for possessing, creating or supplying AI tools designed to generate child sexual abuse material.
Members have raised the issue of the Act’s implementation being too slow. We are aware of the frustrations over the amount of time that it has taken to implement the Online Safety Act, not least because of the importance of the issues at hand. We are committed to working with Ofcom to ensure that the Online Safety Act is implemented as quickly and effectively as possible.
On implementation, would the Minister give clarity about the threshold for re-consultation and the point at which implementation of the children’s codes under the Act would be delayed? Amendments could be made to the children’s codes, and I do not think they would trigger an automatic re-consultation with platforms. Could the Minister elaborate on where the delay would come from and how much scope Parliament has to amend those codes, which will be published in April?
Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.
As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content and, with a deadline of 16 March, to complete risk assessments. Once the illegal harms codes come into effect from 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly following that, in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.
My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.
I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child safety codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect Ofcom will continue to build on those important measures in the codes.
Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.
The Government are confident that the duties to tackle illegal content and, where relevant, protect children from harmful content will have a meaningful impact on the small but risky services to which the hon. Gentleman refers. Ofcom has created a dedicated supervision taskforce for small but high-risk services, recognising the need for a bespoke approach to securing compliance. The team will focus on high-priority risks, such as CSAM, suicide and hate offences directed at women and girls. Where services do not engage with Ofcom and where there is evidence of non-compliance, Ofcom will move quickly to enforcement action, starting with illegal harm duties from 17 March, so work is being done on that.
The comprehensive illegal content safety duties will be applied to all user-to-user forums, and child safety duties will be applied to all user-to-user forums likely to be accessed by children, including the small but high-risk sites. These duties will have the most impact in holding the services to account. Because of the deep concerns about these forums, Ofcom has, as I said, created the small but risky supervision taskforce. For example, Ofcom will be asking an initial set of firms that pose a particular risk, including smaller sites, to disclose their illegal content risk assessments by 31 March.
The Government have been clear that we will act where there is evidence that harm is not being adequately addressed despite the duties being in effect, and we have been clear to Ofcom that it has the Government’s and Parliament’s backing to be bold in the implementation of the Online Safety Act. We are in clear agreement that the Act is not the end of the road, and Ofcom has already committed to iterating on the codes of practice, with the first consultation on further measures being launched this spring. The Government remain open-minded as to how we ensure that users are kept safe online, and where we need to act, we will. To do so, we must ensure that the actions we take are carefully considered and rooted in evidence.
Will the consultation this spring for the next iterations of the codes include consultation with parliamentarians, or is it solely with platforms?
I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.
I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and I hope he can secure the Committee that he has proposed.