Online Safety Act: Implementation

Wednesday 26th February 2025

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not form part of the official record.

10:00
Sir Jeremy Wright (Kenilworth and Southam) (Con)

I beg to move,

That this House has considered the implementation of the Online Safety Act 2023.

It is a great pleasure to serve under your chairmanship, Mr Stringer, and I am grateful for the opportunity to open the debate. Let me start with some positives. The Online Safety Act 2023 is certainly not the last word on the subject, but it is, in my view, a big step forward in online safety, providing a variety of tools that allow the regulator to make the online world safer, particularly for children. I remain of the view that Ofcom is the right regulator for the task, not least because it can start its work sooner as an existing regulator and given the overlap with its existing work—for example, on video-sharing platforms. I also have great regard for the diligence and expertise of many at Ofcom who are now charged with these new responsibilities. However, I am concerned that Ofcom appears unwilling to use all the tools that the Act gives it to make the online world a safer place, and I am concerned that the Government appear unwilling to press Ofcom to be more ambitious. I want to explain why I am concerned, why I think it matters and what can be done about it.

Let me start with what I am worried about. There was a great deal of consensus about the passing of the Online Safety Act, and all of us involved in its development recognised both the urgent need to act on online harms and the enormity of the task. That means that the eventual version of the Act does not cover everything that is bad online and, of necessity, sets up a framework within which the regulator is required to fill in the gaps and has considerable latitude in doing so.

The architecture of that framework is important. Because we recognised that emerging harms would be more clearly and quickly seen by online services themselves than by legislators or regulators, in broad terms the Act requires online services to properly assess the risk of harms arising on their service and then to mitigate those risks. My concern is that Ofcom has taken an unnecessarily restrictive view of the harms it is asking services to assess and act on and, indeed, a view that is inconsistent with the terms of the Act. Specifically, my conversations with Ofcom suggest to me that it believes the Act only gives it power to act on harms that arise from the viewing of individual pieces of bad content. I do not agree, and let me explain why.

With limited exceptions, if an online service has not identified a risk in its risk assessment, it does not have to take action to reduce or eliminate that risk, so which risks are identified in the risk assessment really matters. That is why the Act sets out how a service should go about its risk assessment and what it should look out for. For services that may be accessed by children, the relevant risk assessment duties are set out in section 11 of the Act. Section 11(6) lists the matters that should be taken into account in a children’s risk assessment. Some of those undoubtedly refer to content, but some do not. Section 11(6)(e), for example, refers to

“the extent to which the design of the service, in particular its functionalities”

affects the risk of adults searching for and contacting children online. That is not a risk related to individual bits of content.

It is worth looking at section 11(6)(f), which, if colleagues will indulge me, I want to quote in full. It says that a risk assessment should include

“the different ways in which the service is used, including functionalities or other features of the service that affect how much children use the service (for example a feature that enables content to play automatically), and the impact of such use on the level of risk of harm that might be suffered by children”.

I think that that paragraph is talking about harms well beyond individual pieces of bad content. It is talking about damaging behaviours deliberately instigated by the design and operation of the online service, and the way its algorithms are designed to make us interact with it. That is a problem not just with excessive screen time, on which Ofcom has been conspicuously reluctant to engage, but with the issue of children being led from innocent material to darker and darker corners of the internet. We know that that is what happened to several of the young people whose suicides have been connected to their online activity. Algorithms designed to keep the user on the service for longer make that risk greater, and Ofcom seems reluctant to act on them despite the Act giving it powers to do so. We can see that from the draft code of practice on harm to children, which Ofcom published at the end of last year.

This debate is timely because the final version of the code of practice is due in the next couple of months. If Ofcom is to change course and broaden its characterisation of the risks that online services must act on—as I believe it should—now is the time. Many of the children’s welfare organisations that we all worked with so closely to deliver the Act in the first place are saying the same.

If Ofcom’s view of the harms to children on which services should act falls short of what the Act covers, why does it matter? Again, the answer lies in the architecture of the Act. The codes of practice that Ofcom drafts set out actions that services could take to meet their online safety duties. If services do the things that the codes set out, they are taken to have met the relevant safety duty and are safe from regulatory penalty. If in the code of practice Ofcom asks services to act only on content harms, it is highly likely that that is all services will do, because it is compliance with the code that provides regulatory immunity. If it is not in the code, services probably will not do it. Codes that ignore some of the Act’s provisions to improve children’s safety mean that the online services that children use will ignore those provisions, too. We should all be worried about that.

That brings me to the second area where I believe that Ofcom has misinterpreted the Act. Throughout the passage of the Act, Parliament accepted that the demands that we make of online services to improve the safety of their users would have to be reasonable, not least to balance the risks of online activity with its benefits. In later iterations of the legislation, that balance is represented by the concept of proportionality in the measures that the regulator could require services to take. Again, Ofcom has been given much latitude to interpret proportionality. I am afraid that I do not believe it has done so consistently with Parliament’s intention. Ofcom’s view appears to be that for a measure to be proportionate there must be a substantial amount of evidence to demonstrate its effectiveness. That is not my reading of it.

Section 12 of the Act sets out the obligation on services to take proportionate measures to mitigate and manage risks to children. Section 13(1) offers more on what proportionate means in that context. It states:

“In determining what is proportionate for the purposes of section 12, the following factors, in particular, are relevant—

(a) all the findings of the most recent children’s risk assessment (including as to levels of risk and as to nature, and severity, of potential harm to children), and

(b) the size and capacity of the provider of a service.”

In other words, a measure that would be ruinously expensive or disruptive, especially for a smaller service, and which would deliver only a marginal safety benefit, should not be mandated, but a measure that brings a considerable safety improvement in responding to an identified risk, even if expensive, might well be justified.

Similarly, when it comes to measures recommended in a code of practice, schedule 4(2)(b) states those measures must be

“sufficiently clear, and at a sufficiently detailed level, that providers understand what those measures entail in practice”,

and schedule 4(2)(c) states that recommended measures must be “proportionate and technically feasible”, based on the size and capacity of the service. We should not ask services to do anything that they cannot do, and it should be clear what they have to do to comply. That is what the Act says proportionality means. I cannot find in the Act any support for the idea that we have to know something will work before we try it in order for that action to be proportionate and therefore recommended in a code of practice. Why does that disagreement on interpretation matter? Because we should want online platforms and services to be innovative in how they fulfil their safety objectives, especially in the fast-moving landscape of online harms. I fear that Ofcom’s interpretation of proportionality, as requiring evidence of effectiveness, will achieve the opposite.

There will only be an evidence base on effectiveness for a measure that is already being taken somewhere, and that has been taken for long enough to generate that evidence of effectiveness. If we limit recommended actions to those that have evidence of success, we effectively set the bar for safety measures at current best practice. Given the safe harbour offered by measures recommended in codes of practice, that could mean services being deterred from innovating, because they get the protection only by doing things that are already being done.

Gareth Snell (Stoke-on-Trent Central) (Lab/Co-op)

I thank the right hon. and learned Gentleman for securing this incredibly important debate. He has described in his very good speech how inconsistency can occur across different platforms and providers. As a parent of a 14-year-old daughter who uses multiple apps and platforms, I want confidence about how they are regulated and that the security measures to keep her safe are consistent across all platforms she might access. My responsibility as a parent is to match that. The right hon. and learned Gentleman rightly highlights how Ofcom’s interpretation of the Act has led to inconsistencies and potential grey areas for bad faith actors to exploit, which will ultimately damage our children.

Sir Jeremy Wright

The hon. Gentleman makes an interesting point. We have to balance two things, though. We want consistency, as he suggests, but we also want platforms to respond to the circumstances of their own service, and to push the boundaries of what they can achieve by way of safety measures. As I said, they are in a better position to do so than legislators or regulators are to instruct them. The Act was always intended to put the onus on the platforms to take responsibility for their own safety measures. Given the variety of actors and different services in this space, we are probably not going to get a uniform approach, nor should we want one. The hon. Gentleman is right to say that the regulator needs to ensure that its expectations of everyone are high. There is a further risk: not just that we fix the bar at the status quo, but that some platforms, which have the opportunity to innovate, might go backwards on new safety measures that they are already implementing because those measures are not recommended or encouraged by Ofcom’s code of practice. That cannot be what we want to happen.

Those are two areas where I believe Ofcom’s interpretation of the Act is wrong and retreats in significant ways from Parliament’s intention to give the regulator power to act to enhance children’s online safety. I also believe it matters that it is wrong. The next question is what should be done about it. I accept that sometimes, as legislators, we have no choice but to pass framework legislation, with much of the detail on implementation to come later. That may be because the subject is incredibly complex, or because the subject is fast-moving. In the case of online safety, it is both.

Framework legislation raises serious questions about how Parliament ensures its intentions are followed through in all the subsequent work on implementation. What do we do if we have empowered regulators to act but their actions do not fulfil the expectations that we set out in legislation?

Graham Leadbitter (Moray West, Nairn and Strathspey) (SNP)

Does the right hon. and learned Gentleman agree that this is not only about Ofcom but regulators more widely, and their ability to be agile? Does he believe them to be more risk-averse in areas such as digital technology, relying on traditional consultation time periods, when the technology is moving way faster?

Sir Jeremy Wright

The hon. Gentleman identifies a real risk in this space: we are always playing catch-up, and so are the regulators. That is why we have tried—perhaps not entirely successfully—to design legislation that gives the regulators the capacity to move faster, but we have to ask them to do so and they have to take responsibility for that. I am raising these points because I am concerned that this particular regulator in this particular set of circumstances is not being as fleet of foot as it could be, but the hon. Gentleman is right that this is a concern across the regulatory piece. I would also say that regulators are not the only actor. We might expect the Government to pick up this issue and ensure that regulators do what Parliament expects, but in this area the signs are not encouraging.

As some Members in Westminster Hall this morning know because they were present during the debates on it, elsewhere in the Online Safety Act there is provision to bring forward secondary legislation to determine how online services are categorised, with category 1 services being subject to additional duties and expectations. That process was discussed extensively during the passage of the Act, and an amendment was made to it in the other place to ensure that smaller platforms with high incidences of harmful content could be included in category 1, along with larger platforms. That is an important change, because some of the harm that we are most concerned about may appear on smaller specialist platforms, or may go there to hide from the regulation of larger platforms. The previous Government accepted that amendment in this House, and the current Government actively supported it in opposition.

I am afraid, however, that Ofcom has now advised the Government to disregard that change, and the Government accepted that advice and brought a statutory instrument to Committee on 4 February that blatantly contravenes the will of Parliament and the content of primary legislation. It was a clear test case of the Government’s willingness to defend the ambition of the Online Safety Act, and I am afraid they showed no willingness to do so.

If we cannot rely on the Government to protect the extent of the Act—perhaps we should not, because regulatory independence from the Executive is important—who should do it? I am sure the Minister will say in due course that it falls within the remit of the Science, Innovation and Technology Committee. I mean no disrespect to that Committee, but it has a lot on its plate already and supervision of the fast-moving world of online safety regulation is a big job in itself. It is not, by the way, the only such job that needs doing. We have passed, or are in the process of passing, several other pieces of similar framework legislation in this area, including the Digital Markets, Competition and Consumers Act 2024, the Data (Use and Access) Bill and the Media Act 2024, all of which focus on regulators’ power to act and on the Secretary of State’s power to direct them. Parliament should have the means to oversee how that legislation is being implemented too.

Many of these areas overlap, of course, as regulators have recognised. They established the Digital Regulation Co-operation Forum to deal with the existing need to collaborate, which of course is only likely to grow with the pervasive development of artificial intelligence. Surely we should think about parliamentary oversight along the same lines. That is why I am not the first, nor the only, parliamentarian to be in favour of a new parliamentary Committee—preferably a Joint Committee, so that the expertise of many in the other place can be utilised—to scrutinise digital legislation. The Government have set their face against that idea so far, but I hope they will reconsider.

My final point is that there is urgency. The children’s safety codes will be finalised within weeks, and will set the tone for how ambitious and innovative—or otherwise—online services will be in keeping our children safe online. We should want the highest possible ambition, not a reinforcement of the status quo. Ofcom will say, and has said, that it can always do more in future iterations of the codes, but realistically the first version will stand for years before it is revised, and there will be many missed opportunities to make a child’s online world safer in that time. It is even less likely that new primary legislation will come along to plug any gaps anytime soon.

As the responsible Secretary of State, I signed off the online harms White Paper in 2019. Here we are in 2025, and the Online Safety Act is still not fully in force. We must do the most we can with the legislation we have, and I fear that we are not doing so.

Given the efforts that were made all across the House and well beyond it to deliver the best possible set of legislative powers in this vital area, timidity and lack of ambition on the part of Ministers or regulators—leading to a pulling back from the borders of this Act—is not just a challenge to parliamentary sovereignty but, much more importantly, a dereliction of duty to the vulnerable members of our society, whose online safety is our collective responsibility. There is still time to be braver and ensure that the Online Safety Act fulfils its potential. That is what Ofcom and the Government need to do.

Graham Stringer (in the Chair)

I remind hon. and right hon. Members to bob if they wish to speak. I intend to call the Front-Bench spokespeople at half-past 10, so I will impose a four-minute limit on speeches. That gives very little scope for interventions, though it is up to hon. Members whether to take them, and I may have to reduce the time limit.

09:51
Jess Asato (Lowestoft) (Lab)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, for securing today’s important debate.

I am proud to have worked on the Online Safety Act alongside colleagues in the women’s and children’s sectors, and to have successfully pushed, in particular, for stronger age verification measures to stop children from accessing harmful pornography. Given the abundant harms within the online world and the detrimental impact they have on young people’s development, strong regulation was always going to be necessary. Tech companies have no incentive to care for children when their profit motives compel them to create addictive content, purposely designed to keep kids hooked.

However, regulation is only ever as good as its ability to be enforced. It is clear from my conversations with those who care about children’s online safety that the regulator, Ofcom, needs to do better in many areas. Adequate regulation has never been needed more than now, in an era of a roll-back in online giants’ desires to protect and safeguard their users—from X to Meta—given changing political winds. Self-regulation has clearly failed and we must ensure that Ofcom’s implementation of the Online Safety Act is not loose enough to allow that to continue. I agree with the concerns raised by the right hon. and learned Member for Kenilworth and Southam; what we have seen so far from Ofcom demonstrates that Parliament needs to be doing more to ensure that its will is stamped on the regulatory framework that Ofcom has been forming.

There are many areas where we need to go further. One of the most concerning trends online that we have witnessed has been the rise of extremist misogyny and a culture that incites violence against women and girls more generally. Last year, 77% of girls and young women aged seven to 21 experienced online harm; that includes things such as revenge porn, which affects one in 14 adults. The revenge porn helpline has experienced an average 57% increase in cases each year since it was founded a decade ago. It has also witnessed a 400% rise in cases involving deepfake images. AI is powering today’s misogyny and abuse and more must be done.

That is why I have been campaigning for a ban on nudification apps that create deepfake pornography, by and large, of women and girls without their consent. Issues such as those need to be tackled now and not stewed over for another decade. I am concerned that Ofcom’s age assurance and children’s access codes of practice for part 5 providers—that is, dedicated pornography sites—do not include a clear and measurable definition of what highly effective age assurance means in practice. Without a stringent definition, pornography sites will likely shirk responsibility for implementing a robust system, and Ofcom’s ability to enforce action will be made more difficult. Moreover, we know that the Act did not look at content regulation. That is why we are all eagerly anticipating Baroness Bertin’s pornography review, which I believe is due to be published this week by the Government. Ensuring that online content is regulated in line with offline content, which is overseen by the British Board of Film Classification, will be key.

We must look to expand age assurance to the level of the app store. App stores were not included in the Online Safety Act. Indeed, Ofcom has been given two years to conduct a review into app stores. I strongly believe that that needs to be brought forward. App stores are not adequately ensuring that apps are age-appropriate, and more needs to be done to stop children downloading apps that can lead them to dark and harmful places. As a Parliament, we must be willing to bring forward legislation that complements and builds on the Online Safety Act, to ensure that Ofcom acts to protect our women and children.

09:55
Dame Caroline Dinenage (Gosport) (Con)

It is a pleasure to serve under your chairmanship, Mr Stringer. I congratulate my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) on securing this vital debate. When he introduced the online safety White Paper in 2019, it was because we could not rely on big tech to regulate what was hosted on their platforms; it simply was not working. Under the previous Government, we saw the tragic death of Molly Russell in 2017 and the complete failure of tech firms to adequately police illegal content on their sites, let alone the lawful but awful content that was being fed to our children from dawn till dusk.

Here we are, six years later, to discuss how the Online Safety Act is being implemented. In the meantime, virtually every Minister who has held the baton for this issue, including myself for a couple of years, has used this piece of legislation as almost a silver bullet for every harm that is encountered in the online world. I have often said that if ever there was a piece of legislation for which the phrase, “We mustn’t let the perfect be the enemy of the good” was invented, it is this one. We now need to hit the ground running and ensure that the legislation is implemented fast and effectively, in line with the sentiment that gave rise to it, as my right hon. and learned Friend the Member for Kenilworth and Southam suggested. Every day that Ofcom does not enforce its age assurance requirements for porn providers and illegal harms codes is a day that young children across the country are at serious risk of having their childhood stolen.

The Online Safety Act was a complicated and groundbreaking piece of legislation. No other Government in the world at the time had attempted to regulate the internet so effectively. I was pleased that when the Bill came back from the House of Lords, it was not just the size of the platforms that was taken into account when deciding the category of service, but the level of risk they represented, which is also really important. It is important to recognise that other countries and the EU have legislated while we have refined, and now we need to act.

I am glad that since the Act was passed in October 2023, Ofcom has worked at pace to bring forward codes on areas such as children’s safety duties, illegal harms and age assurance, which will have a massive and tangible impact. Ofcom intends to consult on further proposals to strengthen the codes this spring, and it is really important that that focuses on the issues we are seeing, such as hash-matching for terrorist and intimate image abuse content. That is particularly important considering the emergence of deepfakes as the new front in the war against women and girls—99% of pornographic images and deepfakes are of women.

In the light of this increasingly agile, polarising and inventive online world, I am concerned by reports in the media that the Government have decided to put the drive to keep protections up to date with tech developments on ice. There are reports in The Telegraph that Elon Musk is pushing for the Act to be watered down as part of a bargain to avoid trade tariffs. We are all looking for reassurance that, after so many years of work on this legislation by so many people, the Government will not water down or somehow filter its protections.

The Government have acknowledged that there has been an increase in suicides among young people, with suicide-related internet use found in 26% of deaths in under-20s. They made a manifesto commitment to build on this Act, and they must not row back on that. We cannot give up the fight to make the digital world a more pleasant and user-friendly place. We must never forget that if internet companies were doing what they say they are doing and enforcing their own terms and conditions, this legislation would not even be necessary, and the Government need to hold them to account.

09:55
Gregor Poynton (Livingston) (Lab)

It is a pleasure to serve under your chairship, Mr Stringer. My congratulations to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on securing this important debate.

Online safety and the wellbeing of our children and young people in digital and online spaces are issues that guide many of us in the House, across the parties, and across the country. I speak only on my own behalf, but as chair of the all-party parliamentary group on children’s online safety, I believe that the Online Safety Act is landmark legislation that has the potential to transform the safety of children and young people in the online world, and I applaud the Government’s commitment to creating the safest possible environment for our children, especially in the face of the growing dangers that lurk in the online space.

The Act is designed to tackle the pervasive issues of child sexual abuse material and online grooming. With provisions such as the requirement for platforms to scan for known child sexual abuse material, it has the potential to reduce significantly the availability of such content. Platforms will now have a legal obligation to take action, including by adopting measures such as hash matching, which will prevent the sharing of known CSAM. This is a major step forward and will undoubtedly save countless children from exploitation.

However, there are some concerns that I wish to raise to ensure that the full potential of the Act is realised. Hon. Members have raised many of them already, but I hope that this will give weight to them, and I hope that Ofcom will be listening to our concerns about the Act’s implementation. One of the most pressing issues raised by experts, including the Internet Watch Foundation, is the interpretation of “technically feasible” in Ofcom’s illegal harms codes. Although the Act requires platforms to take steps to remove illegal content, the codes suggest that services are obliged to do so only when that is deemed technically feasible. That could lead to a situation in which platforms, rather than taking proactive steps to safeguard users, simply opt out of finding innovative solutions to prevent harm.

I do not believe that that is the ambitious, risk-based regulatory approach that Parliament envisaged when it passed the Online Safety Act. These are the same platforms that have spent billions of pounds on R&D, developing highly sophisticated algorithms to solve complex technical problems, target ads effectively to drive revenue and serve audiences the content that they want to see. They have a global reach: they have the tools, the people and the budgets to solve these problems. Therefore, we must ensure that platforms are incentivised to go beyond the bare minimum and truly innovate to protect our children. I echo the calls from multiple civil society organisations working in this area for us to require platforms to take a safety-by-design approach.

Another serious concern is the potential for platforms to use the safe harbour provision offered by the Act. That would allow companies to claim that they are compliant with the codes of practice, simply by following the prescribed rules and without necessarily addressing the underlying harms on their platforms. As the Internet Watch Foundation has rightly pointed out, it risks leaving platforms operating in a way that is compliant on paper but ineffective in practice.

I also ask Ofcom to look more quickly, as my hon. Friend the Member for Lowestoft (Jess Asato) has suggested, at Apple and Google’s app stores. They have a wealth of data and can be effective gamekeepers, particularly on age verification, if they are pressed into service. Finally, I encourage the Government and Ofcom to address more fully the issue of private communications. Many predators exploit private messaging apps to groom children, yet the Act’s provisions on private communications are limited. It is vital that we ensure that private spaces do not become safe havens for criminals and that platforms are held accountable for the spread of CSAM, regardless of whether that occurs in private or public spaces.

I hope that my hon. Friend the Minister can address those points in her response and that they will be kept front of mind by Ofcom, the Government and the tech giants as we all seek to ensure that digital and online spaces, which are increasingly important in all our lives, are safe and secure for our children and young people.

10:03
Monica Harding (Esher and Walton) (LD)

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for organising this important debate and for his continued work scrutinising this legislation.

The Online Safety Act was a landmark step towards making the internet a safer place, particularly for our children, but its implementation has fallen far short of what Parliament intended, hampered by Ofcom’s slow pace and limited ambition. Initially, the Act was designed to ensure that tech companies take responsibility for protecting users, especially children, from harmful content, but the current approach taken by Ofcom undermines that intent in several ways.

We have waited more than a year for Ofcom to complete its consultation on the illegal content codes of practice, but those codes fail to enforce a robust safety-by-design approach. Instead of proactively mitigating risks, many of their measures focus only on responding to harm after it has already occurred, and the children’s safety codes, which are still in draft, appear to follow a similarly disappointing trajectory. Features such as livestreaming, ephemeral content and recommender algorithms—tools that are frequently exploited for the purpose of online abuse—are also not meaningfully addressed in the current framework.

The Act has significant shortcomings in that it also allows companies to be deemed compliant simply by following Ofcom’s codes, regardless of whether their platforms remain unsafe in reality. This means that tech giants are permitted to hide behind a regulatory shield rather than being forced to address known risks on their platforms; all the while, children continue to be exposed to harm. The Act also explicitly requires protections tailored to different age groups, but in implementing it, Ofcom treats a seven-year-old and a 17-year-old as if their online safety needs were identical. In doing so, it has fundamentally failed to recognise how children’s development affects their online experiences and their vulnerabilities.

The action on fake and anonymous accounts has been slow and weak. This was a huge area of focus for parliamentarians before the Act was passed, and Ofcom itself identified it as a major risk factor in crimes such as terrorism, child sexual exploitation, harassment and fraud. As we approach 18 months since the passage of the Act, there has been no change for UK users. Instead of prioritising verification measures, Ofcom has pushed them to a later phase of implementation, delaying real action until at least 2027. That is unacceptable, especially when Ofcom’s own research shows that over 60% of eight to 11-year-olds are active on social media, despite existing age restrictions prohibiting it.

The Government’s and Ofcom’s delays in introducing user identity verification measures are unacceptable. The harms associated with fake and anonymous accounts are deeply personal and painfully real, with millions of Britons suffering from online abuse, scams and harassment each year. I hope the Minister can provide a robust explanation for the timidity and delay, and rule out any suggestion that the delays were a result of lobbying pressures from platforms. The best assurance she could give today would be a commitment that the introduction of verification measures will be brought forward to 2026, so that UK internet users are better protected.

In short, I ask the Minister to recognise the urgency of taking the following action. Ofcom must revise its codes to require proactive risk mitigation; tech companies should not be allowed to claim compliance with the regulatory framework, all the while continuing to expose users to harm; platforms must be held accountable if they fail to meet the real safety standards; and protections need to be specific to different age groups, so that younger children and teenagers receive appropriate levels of safety and access.

10:07
Alistair Strathern (Hitchin) (Lab)

It is a pleasure to serve under your chairship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this important debate. His contribution highlighted why he will continue to be an important voice as we go forwards as a Parliament in doing everything we can to keep young people safe online.

For a long time now, Parliament has regulated to keep young people safe from a whole host of harms, which are often tangible and physical. The precautionary principle has been front and centre of our efforts—almost to a fault sometimes, people might argue—to keep young people safe from harms that they simply should not be exposed to. When we look at online harm, however, it is clear that the precautionary principle has not always been there.

There is a range of reasons for that. I hope hon. Members will not mind me highlighting that, for many of us, the online world was not quite such a big presence in our lived experience growing up. Therefore, when it comes to legislating for the online world, the more recent nature of some of the developments means that the evidence base is inherently slightly more limited. We have to be confident in the principled, risk-based approach to acting, and act when we know it is right to do so.

We have to know that more urgent action in this space is the right thing to do. It is impossible not to be moved by the testimony of parents who have gone through some of the most heartbreaking tragedies as a result of our historical inaction, just as it is impossible for me not to be stirred to act when I visit schools and pupils of all ages consistently raise their own fears and concerns about what they are being exposed to online and its impact on them and their mental health.

Other Members have rightly highlighted some of the shortcomings of the Online Safety Act, but, as the right hon. and learned Member for Kenilworth and Southam pointed out, it is important to note the urgency of using the tools available to us now, given our historical inaction. We must ensure that we have the strongest possible implementation of the Act, which means that the strongest possible children’s code from Ofcom will be front and centre.

As other colleagues have highlighted, there is a whole host of ways in which Ofcom has been far too conservative and limited in its interpretation of the powers that Parliament has given it in bringing forward the children’s code, as well as in its wider approach to the Act. As 5rights and others have highlighted, the approach of focusing purely on content, rather than on design and features, means that a whole host of harms, which are explicitly called out in the Act, are not addressed.

There is nothing more tragic than the story of Molly Russell. The Molly Rose Foundation, set up in her name, is very clear on the role that algorithms, doom spiralling and young people consistently being pushed towards some of the most harmful content for their age played in what happened to her, and to far too many young people right across the country. In section 11(6)(f) of the Act, Parliament very explicitly made it clear that those features should be considered. Ofcom needs to make sure that that is brought forward, and that the code explicitly considers how technology companies can ensure that the safety of features and design is considered right across the age range.

Alongside that, Internet Matters and many other groups have been really clear in pointing out that the current approach to age appropriateness—the flattening when it comes to people over and under 18—and the weak guidance on age verification risk not doing justice to Parliament’s very clear steer in section 12 that content and features should be considered from a risk-based perspective right across the age range. Again, that is a clear area where I think Ofcom could and should do a lot more.

As others including the IWF have pointed out, while some consideration of technical feasibility is obviously needed, the carve-out, as currently drafted, risks being an opt-out and a dilution of the ambition of tech companies in stepping up to the plate and making sure they are playing their part in keeping young people safe online.

There is a lot more we will need to do, and I have no doubt that the curriculum review—that is a separate matter—will be important in making sure we are playing our part in empowering young people to feel more confident and safe in these spaces. I am very glad to be doing this work in a Parliament where there are so many strong voices on this issue. Given its urgency, I really do hope that we can make progress between now and the upcoming children’s code to ensure that we are meeting the need of this moment fully.

10:11
Jim Shannon (Strangford) (DUP)

It is a pleasure to serve under your chairship, Mr Stringer. I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on introducing the debate. I thank him for all that he has done over the years. We all recognise that. His deep interest in the subject matter was illustrated by the way he set the scene to great effect—not that others have not done so, but he did it exceptionally well.

The Office for National Statistics revealed that 83% of 12 to 15-year-olds now own a smartphone with full internet access. They use them for school, and parents use them to keep an eye on their children through location services. There is a world of good that can be done with a phone; however, we are all aware that there is also a world of harm. When I was a boy, the bullies’ power left them when we left the school gates; now, their reach is vastly extended, and children’s mental health is the price to be paid.

I have spoken on many occasions in the House on this issue and on the Act, and I believe that we absolutely need a new, safe online world for our children. Cyber-bullying, grooming and online exploitation are real. As I highlighted in November, in the last debate on this topic, the Police Service of Northern Ireland revealed that in 2023, crimes involving children being contacted online by sexual predators rose by nearly a third in Northern Ireland. That is a very worrying figure. The scale of this issue is astronomical. I think of how vulnerable and precious our children are, and my heart aches at the number of children whose innocence has been taken from them at an early age. The joy of childhood comes from the magic of innocence, and anyone who takes that, whether by touch or online, is guilty of a crime. The entire purpose of the Act is to protect children, and we must see its full implementation.

More than three quarters of people who have seen self-harm content online first saw it at the age of 14 or younger, and individuals with a history of self-harm report being 10 years old or younger when they first viewed such content. Without very strict controls, children of any age can view things that simply are not appropriate for their wee minds. I am a great believer that it is parents’ job to do all they can to provide for their child: the love, safety, food, and clothing. That is harder than ever to do in a world that parents cannot access.

I speak as a grandparent who does not have the ability to do the things that others can do. I know that there is this unlimited world of access to unknown things. I am thankful that back home, the Minister of Education, my colleague Paul Givan, is attempting to send the message that online access needs to be curtailed, by investing in a pilot scheme for pouches that children put their phones into while in school. That prevents online access, and it means less distraction too. More than that, it ensures that children begin to learn that their phone does not need to be at their fingertips or at their ear. In fact, perhaps we adults need to remember that as well. Let us be honest: at Prime Minister’s questions, when we look across the Chamber, what will we all be doing? Probably looking at our phones. We should not be doing that; we should be concentrating on the Chamber. The most important thing is the message being sent to children—hopefully it is something that they can take into their working lives, too—that they can switch these things off and learn to reconnect with the real world in front of them. I congratulate the Northern Ireland Minister for doing that.

I commend the right hon. and learned Member for Kenilworth and Southam for the continued and solid work that he has put into this legislation. Children throughout the United Kingdom of Great Britain and Northern Ireland will be safer and happier for it. I often feel we have one job as a parent: to protect our children and their future. This legislation will hopefully play a part in helping parents to protect the most treasured part of their life, and I will always support that.

10:15
Lola McEvoy (Darlington) (Lab)

It is a pleasure to serve under your chairmanship, Mr Stringer. I pay tribute to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for his exceptional work and for his collegiate approach to this issue. In the interests of time, I will dive straight into the detail of what Ofcom is at risk of failing on in the implementation of its children’s safety codes.

As a trade union organiser, I know more than most about risk assessments and how they can be used in practice to protect people. A static risk assessment, as is required by the Act, will be used to assess the risk at that point in time; there will be a legal requirement to update or check that assessment within a year of its first iteration. A static risk assessment will assess the risk broadly, and if the online platforms adhere to the assessment, they will be in keeping with the legislation and will be given safe harbour, as has already been covered. That is not sufficient for the cohort of people using the platform at this time. The protection of children codes that are being published in April must require the use of a dynamic risk assessment.

Dynamic risk assessment is used by the Ministry of Defence, the NHS and several other work environments where the cohort they work with is vulnerable or at risk of injury or harm, and/or where the staff are at risk of injury from the work they do. Dynamic risk assessments are updated in real time. If the risk cannot be mitigated in real time, the activity must be stopped. I cannot fathom why these assessments are not being incorporated in the first iteration of the children’s codes. They would require the platforms to act in real time when they see children coming to harm, engaging in harmful behaviours or being exposed to harmful content. We know that myriad problems will arise when the codes are implemented. I believe strongly that if a dynamic risk assessment is included for those who say that they have children on their platforms, children will be safer in real time.

This is important not only because a dynamic risk assessment is an enhancement in itself, but because it makes sure that there is a point person responsible for that work. A point person at the platforms is already included in the Online Safety Act, responsible for being in touch with the Government and Ofcom and for implementing the measures in the Act. A DRA would mean that there was a responsible point person looking in real time to protect children. That is the first point.

I have several other points to make, but only a tiny amount of time. Next, it is clear to me that functionalities should be included in the scope of the Act. I have spoken to Ofcom and to the platforms about it. The platforms are already including functionalities in their preliminary risk assessments, so their reading of the Act is that functionalities must be included. If they are going further already, I do not know why Ofcom would not stipulate that they continue to do so. Ofcom’s desire to include an on-off toggle for some of the functionalities is not sufficient to protect children because, as many of us who have been involved in these debates for a long time know, children will just switch them on. It is not sufficient to have a default-off option either.

I will also touch on Jools’ law. As we have previously discussed in the Chamber, we need an amendment to make sure that in the tragic event of a child's death, a notice is automatically issued to the regulated online platforms to freeze the child’s accounts to protect them from deletion and to protect the data for the families going through an inquest. I pay tribute to the bereaved families who have worked on this. Finally, on timing, we have heard that any changes to the codes will delay implementation. I do not agree with that.

10:19
Bobby Dean (Carshalton and Wallington) (LD)

It is a pleasure to serve under your chairmanship, Mr Stringer. Social media has the power to provide spaces for connection, free speech and content creation that were unimaginable just a few decades ago. I remember what it was like to be a part of the first generation of teenagers to use social media. I hear the likes of MSN, Myspace and Bebo are no longer a thing among the youth, but I understand the joy of platforms like them and why we would not want our parents involved and snooping around on them. None the less, exactly how much space and freedom we should afford teenagers as parents and as society is the subject of intense debate.

When I speak to parents or teachers about social media, they tell me that they are concerned about how much time children spend on their devices, who they are speaking to and the fear that comes from not knowing what they are watching and reading. That is no surprise, because we as adults are struggling on the same platforms in the same way, and there is very little reassurance that the experience that young people get is much different from our own. The violence, the pornography, the hate—we all see it, and they see it too.

Just a few weeks ago, there was a horrific stabbing in my constituency involving a teenage boy. The video was posted all over social media within minutes. It kept popping up in my feed on Facebook, as it was shared across local groups, and I was tagged in the video on X. The video depicted the whole scene, unfiltered, without a warning. My thoughts went to the victim’s family and to the young teenagers at the college around the corner, who I am sure will have been watching it, too. I do not think I am imagining it when I say that, not long ago, a video like that simply would not have got around as quickly or been seen as widely. It would have been taken down, at least eventually, but with the purposeful rolling back of moderation by giants like Meta and X, violent content is not just becoming more frequent; it is becoming normalised.

How have we got here? Ultimately, it is because we have allowed the tech giants to become too powerful, with regulation arriving too slowly and without enough teeth. Once upon a time, the greatest minds took up careers in law and medicine, but now the big money and prestige is in big tech, an industry that, on the face of it, sells us nothing; we do not pay for their services with money, but we pay with our attention. The longer they can keep us looking at their platforms, the more ads we see and the more money they make, so we have the world’s most talented people working out the circuitry of our brains and creating products that are, by design, addictive. What we look at does not matter, only that we are looking, so there is no inherent commercial incentive to fix the problem of dangerous and harmful content.

Just imagine if all that energy and talent were directed into fail-proof age verification, taking down fake accounts and other safety-by-design measures. Tough law and regulation is our only answer. The concern expressed in this debate is that the Online Safety Act was watered down on its way through Parliament, and further weakened by Ofcom’s guidance; my fear now is that it is under further threat, as, in trade negotiations with the US, this tech bro-fuelled Trump presidency may demand a further weakening.

As it stands, small companies are already off the hook. It does not matter how harmful the content is, as long as its user base is small. The large companies have the legal representation and increasing soft power in practice to avoid compliance, and we are already seeing the consequences of that. Will the Government give us assurances in this debate that, as the mood music in America is to backslide on protections, the UK will stand strong? Will the Government commit to do the opposite of backsliding, to engage with children’s charities and other campaigners who have deep concerns about the gaps in the existing legislation and regulation by Ofcom, and to work to strengthen those protections further in the coming year?

It sounds very obvious, but the kids of today will soon become adults. The world that surrounds them as children will shape their views as adults. One of the most depressing things I have read recently is that teenage girls are the group most likely to be victims of domestic abuse. That is attributed in part to the rise of misogynistic content. If we fail to get the most profitable companies in the world to act, we fail everybody.

10:23
Kirsty Blackman (Aberdeen North) (SNP)

I thank you for chairing this debate, Mr Stringer, and I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on bringing this debate to Westminster Hall. It is a subject we have talked about many times.

I want to make a number of points. The first is about safety by design. Page 1 of the Act states that the internet should be “safe by design”, yet everything that has happened since in the Act’s implementation, from the point of view of both Ofcom and the Government in respect of some of the secondary legislation, has not been about safety by design. It has been about regulating specific content, for example, and that is not where we should be. Much as I was happy that the Online Safety Act was passed, and I was worried about the perfect being the enemy of the good and all that, I am beginning to believe that the EU’s Digital Services Act will do a much better job of regulating, not least because the Government are failing to take enough action on this issue.

I am concerned that Ofcom, in collaboration with the Government, has managed to get us to a situation that makes nobody happy. It is not helpful for some of the tech companies. For example, category 1 is based solely on user numbers, which means that suicide forums, eating disorder platforms, doxing platforms and livestreaming platforms where self-generated child sexual abuse material is created are subject to exactly the same rules as a hill walking forum that gets three posts a week. In terms of proportionality, Ofcom is also failing the smallest platforms that are not risky, by requiring them to come to a three-day seminar on how to comply, when they might be run by a handful of volunteers spending a couple of hours a week looking after the forum and moderating every post. It will be very difficult for them to prove that children do not use their platforms, so there is no proportionality at either end of the spectrum.

In terms of where we are with the review, this is a very different Parliament from the one that began the conversations in the Joint Committee on the Draft Online Safety Bill. It felt like hardly anybody in these rooms knew anything about the online world or had any understanding of it. It is totally different now. There are so many MPs here who, for example, have an employment history of working hard to make improvements in this area. As the right hon. and learned Member said, we now have so much expertise in these rooms that we could act to ensure that the legislation worked properly. Rather than us constantly having to call these debates, the Government could rely on some of our expertise. They would not have to take on every one of a Joint Committee’s recommendations, for example, but they could rely on some of the expertise and the links that we have made over the years that we have been embedded in this area to help them make good decisions and ensure some level of safety by design.

Like so many Members in this place, I am concerned that the Act will not do what it is supposed to do. For me, the key thing was always keeping children safe online, whether that is about the commitments regularly given by the Government, which I wholeheartedly believe they wanted to fulfil, about hash matching to identify grooming behaviours, or about the doxing forums or suicide forums—those dark places of the internet—which will be subject to exactly the same rules as a hill walking forum. They are just going to fill in a risk assessment and say, “No children use our platform. There’s no risk on our platform, so it’s all good.” The Government had an opportunity to categorise them and they chose not to. I urge them to change their mind.

10:28
Martin Wrigley Portrait Martin Wrigley (Newton Abbot) (LD)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Stringer. I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on securing this debate.

We have heard some consistent themes coming through. We have heard about Ofcom perhaps misinterpreting what the House intended with the Act. We have heard about the importance of the Ofcom code of practice, how it is constructed and how it drives online platforms’ behaviour. We have heard from the hon. Member for Stoke-on-Trent Central (Gareth Snell) about the importance of conformity across different platforms. We have heard that regulators might not be fulfilling the expectations of this House. We have also heard from the hon. Member for Gosport (Dame Caroline Dinenage) about lawful but awful content and about how we should not let the perfect be the enemy of the good.

I think there is a feeling that the Act does what it does, but that the interpretation has not been what was hoped for and that there is still much more to do. We heard from the hon. Member for Livingston (Gregor Poynton) about the “legal but harmful” loophole, and also about bringing in safety by design, which became a consistent theme throughout the rest of the conversations. My hon. Friend the Member for Esher and Walton (Monica Harding) talked about designing services to protect children and the framework’s lack of mitigations for livestreaming, and said that seven-year-olds and 17-year-olds are treated the same. That is clearly not right.

The hon. Member for Hitchin (Alistair Strathern) impressed upon us the urgency and importance of the children’s safety codes. The hon. Member for Strangford (Jim Shannon) cited the astonishing fact that 83% of 10 to 15-year-olds have phones—that is an amazing proportion—and also mentioned cyber-bullying.

Other hon. Members spoke about other areas, but the same things came up. As a member of the Science, Innovation and Technology Committee and, until recently, a tribunal member with the telecoms regulator—that responsibility has now moved to Ofcom—I have seen the importance of the codes of practice and how long it takes to revise them. Thirty years in the telecoms industry showed me how tough age assessment can be. I have also spent time delivering app stores, but before the age of Google and Apple phones.

It is clear that the hard-won amendment to include smaller sites with harmful content has been lost through its exclusion from the statutory instrument. In the Bill Committee, the Minister said that we must do everything in our power, and that there is much more to do. We have heard a lot about what needs to be done, and we urge the Government to do it. We urge them to look again at the exclusion of small but harmful sites and to continue to look at how we can improve the implementation of safety by design.

10:31
Ben Obese-Jecty Portrait Ben Obese-Jecty (Huntingdon) (Con)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this timely debate. His wealth of knowledge on this topic is clear, and his voice in pursuing the most effective iteration of the legislation has been constant.

The previous Government passed the world-leading Online Safety Act, which places significant new responsibilities and duties on social media platforms and search services to increase child safety online—aims that all Members can agree upon. Platforms will be required to prevent children from accessing harmful and age-inappropriate content, and to provide parents and children with clear and accessible ways to report problems online when they arise.

The evidence base showing that social media is adversely impacting our children’s mental health is growing stronger. The Royal Society for Public Health says that about 70% of young people now report that social media increases their feelings of anxiety and depression. It is for those reasons that Conservative Ministers ensured the strongest measures in the Act to protect children.

The Act places duties on online platforms to protect children’s safety and put in place measures to mitigate risks. They will also need to proactively tackle the most harmful illegal content and activity. Once in force, the Act will create a new regulatory regime to significantly improve internet safety, particularly for young people. It will address the rise in harmful content online and will give Ofcom new powers to fulfil the role of the independent regulator. Fundamentally, it will ensure services take responsibility for making their products safe for their users.

I note that the Government have said that they are prioritising work with Ofcom to get the Act implemented swiftly and effectively to deliver a safer online world, but I recognise the concerns of parents and campaigners who worry that children will continue to be exposed to harmful and age-inappropriate content every day until these regulations come into force. Will the Minister acknowledge those concerns in her remarks?

The Act places new duties on certain internet services to protect users from illegal content on their platforms. The purpose of those illegal content duties is to require providers of user-to-user and search services to take more responsibility for protecting UK-based users from illegal content and activity that is facilitated or encountered via their services.

In December, Ofcom published its finalised illegal harms codes of practice and risk assessment guidance. The codes of practice describe the measures that services can take to fulfil their illegal content duties, and they recommend that providers of different kinds and with different capacities take different steps proportionate to their size, capacity and level of risk.

The codes recommend measures in areas including user support, safety by design, additional protections for children and content moderation or de-indexing. Many of the measures in the draft codes are cross-cutting and will help to address all illegal harms. Certain measures are targeted at specific high-priority harms, including child sexual abuse material, terrorism and fraud. Those include measures on automated tools to detect child sexual abuse material, and routes for the police and the Financial Conduct Authority to report fraud and scams to online service providers. The measures will also make it easier for users to report potentially illegal content.

Ofcom has also published guidance on how providers should carry out risk assessments for illegal content and activity. Providers now have three months to complete their illegal content risk assessment. Can the Minister update the House on whether the completion of the risk assessments will coincide with the codes of practice coming into force?

Another important milestone was the publication of Ofcom’s children’s access assessment guidance last month. Services will have to assess whether their service is likely to be accessed by children and, once the protection of children codes have been finalised by the summer, must put in place the appropriate protections, known as age assurance duties.

All services that allow pornography must implement highly effective age assurance by July at the latest, to ensure that children are not normally able to access pornographic content. Together, the illegal harms and child safety codes should put in place an important foundation for the protection of users. For example, children will be better protected online, with services having to introduce robust age checks to prevent children from seeing content such as suicide and self-harm material and pornography, and having to tackle harmful algorithms. Illegal content, including hate speech, terrorist content and content that encourages or facilitates suicide, should be taken down as soon as services become aware of it. Women and girls will be better protected from misogyny, harassment and abuse online.

The Government have said they are keen for Ofcom to use its enforcement powers as the requirements on services come into effect to make sure that the protections promised by the Act are delivered for users. Samaritans has called on the Government and Ofcom to

“fully harness the power of the Online Safety Act to ensure people are protected from dangerous content”.

Will the Minister confirm that the Government will fully back Ofcom in its enforcement of the illegal harms and child safety codes?

There are concerns that Ofcom appears to be relying on future iterations of the codes to bring in the more robust requirements that would improve safety. Relying on revision of the codes to bring them up to the required standard is likely to be a slow process. The requirement to implement the initial codes and guidance is significant and is unlikely to leave capacity for revision. Furthermore, the Secretary of State’s ability to stipulate such revisions could slow the process further. To that end, it is essential that the first iteration of the codes of practice is robust enough to endure without the need for revision in the short term. Although that might be difficult to achieve in an environment that moves as quickly as the digital space, it must be striven for, lest we end up with legislation that does not hold online platforms to account and does not protect victims of online harms as it should.

As legislators, we have a responsibility to ensure that the online world is a safe place for our children. We also have a responsibility to ensure that online platforms take their obligations seriously. I am pleased that the previous Government’s Online Safety Act delivers on both those points. I urge the Minister to ensure that it is fully implemented as soon as possible.

Graham Stringer Portrait Graham Stringer (in the Chair)
- Hansard - - - Excerpts

We have gained a considerable amount of time because of disciplined interventions and short speeches. I ask the Minister to ensure that there is a small amount of time at the end for the Member in charge to wind up.

10:37
Feryal Clark Portrait The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Stringer. I thank the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) for securing this debate on the implementation of the Online Safety Act. I know that he has been following the Bill throughout its passage and has been a critic of every Minister, even his Government’s Ministers, whenever the Bill was watered down or delayed, so I expect him to hold all of us to account. I am grateful to him and all the hon. Members who have spoken this morning. The Government share their commitment to keeping users safe online. It is crucial that we continue to have conversations about how best to achieve that goal.

The Online Safety Act lays the foundations for strong protections against illegal content and harmful material online. It addresses the complex nature of online harm, recognising that harm is not limited to explicit content and extends to the design and functionality of online services. We know that the legislation is not perfect. I hear that in every such debate, but we are committed to supporting Ofcom to ensure that the Act is implemented quickly, as this is the fastest way to protect people online. 2025 is the year of action for online safety, and the Government have already taken a number of steps to build on Ofcom’s implementation of the Act. In November last year, the Secretary of State published the draft statement of strategic priorities for online safety. That statement is designed to deliver a comprehensive, forward-looking set of online safety priorities for the full term of this Government. It will give Ofcom the backing to be bold in specific areas, such as embedding safety by design, through considering all aspects of a service’s business model, including functionalities and algorithms.

We are also working to build further on the evidence base to inform our next steps on online safety, and I know that this issue was debated earlier this week. In December, we announced a feasibility study to understand the impact of smartphones and social media on children, and in the Data (Use and Access) Bill we have included provisions to allow the Secretary of State to create a new researcher access regime for online safety data. That regime is intended to fix a systemic issue that has historically prevented researchers from understanding how platforms operate, and it will help to identify and mitigate new and preventable harms. We have also made updates to the framework, such as strengthening measures to tackle intimate image abuse under the Online Safety Act, and we are following up on our manifesto commitment to hold perpetrators to account for the creation of explicit, non-consensual deepfake images through amendments to the Data Bill.

We are also building on the measures in the Online Safety Act that allow Ofcom to take information on behalf of coroners. Through the Data Bill, we are bringing in additional powers to allow coroners to request Ofcom to issue a notice requiring platforms to preserve children’s data, which can be crucial for investigations into a child’s tragic death. My hon. Friend the Member for Darlington (Lola McEvoy) raised Jools’ law, of which I am very aware, and I believe that she is meeting Ministers this week to discuss it further.

Finally, we recently announced that, in the upcoming Crime and Policing Bill, we are introducing multiple offences to tackle AI sexual abuse, including a new offence for possessing, creating or supplying AI tools designed to generate child sexual abuse material.

Members have raised the issue of the Act’s implementation being too slow. We are aware of the frustrations over the amount of time that it has taken to implement the Online Safety Act, not least because of the importance of the issues at hand. We are committed to working with Ofcom to ensure that the Online Safety Act is implemented as quickly and effectively as possible.

Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

On implementation, would the Minister give clarity about the watermark for re-consultation and the point of delay of implementing the children’s codes under the Act? Amendments could be made to the children’s codes and I do not think they would trigger an automatic re-consultation with platforms. Could the Minister elaborate on where the delay would come from and how much scope Parliament has to amend those codes, which will be published in April?

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

Ofcom has had to spend a long time consulting on the codes to ensure that they are as proofed against judicial review as possible. Any re-consultation or review of the codes will result in a delay, and the best way to ensure that we can protect children is to implement the Act as soon as possible. My hon. Friend referred to the fact that both Ofcom and the Secretary of State have said that this is not a done deal; it is an iterative process, so of course we expect those codes to be reviewed.

As I said, Ofcom is moving forward with implementation of the Act. In a matter of weeks, we will start to see, for the first time, safety duties making a material difference to online experiences for adults and children. Platforms are already duty-bound to assess the risk of illegal content, with a deadline of 16 March to complete their risk assessments. Once the illegal harms codes come into effect on 17 March, Ofcom will be able to enforce the illegal content safety duties. Shortly after that, in April, Ofcom will publish the child safety codes and associated guidance, starting the clock for services to assess the risk of content harmful to children on their platforms. The child safety duties should be fully in effect by the summer.

My hon. Friend the Member for Darlington also raised the issue of dynamic risk assessment. I understand that she is in conversation with Ofcom and Ministers on that. I will await the outcome of those discussions. The implementation of the Act will bring in long overdue measures, such as preventing children from accessing pornography and legal content encouraging suicide, self-harm or eating disorders.

I have heard concerns raised by hon. Members regarding Ofcom’s approach, particularly to harmful functionalities and safety by design. We understand there is still a lot of work to be done, which is why the Secretary of State’s statement of strategic priorities places a high importance on safety by design. However, it is important not to lose sight of the positive steps we expect to see this year under the Act. For instance, Ofcom’s draft child codes already include specific measures to address harmful algorithms, among other safety recommendations. We expect Ofcom will continue to build on those important measures in the codes.

Questions were asked about whether the Government have plans to water down the Act. I can categorically state that there are no plans to water down the measures. The Secretary of State has made it very clear that any social media company that wants to operate in our society will have to comply with the law of the land. Whatever changes are made in other jurisdictions, the law of the land will remain.

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

The Minister might be about to come to the point I want to raise with her, which is about proportionality. Will she say something about that? I am keen to understand whether the Government accept Ofcom’s understanding of the term—that proportional measures are those measures that can be evidenced as effective. I gave reasons why I am concerned about that. I want to understand whether the Government believe that that is the correct interpretation of proportionality.

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

I was about to come to the point that the right hon. and learned Member raised about the digital regulation Committee. I have had a brief conversation with him about that, and agree about the importance of parliamentary scrutiny of the implementation of the Online Safety Act. I welcome the expertise that Members of both Houses bring. Select Committees are a matter for the House, as he is aware.

We will continue to work with the House of Lords Communications and Digital Committee and the House of Commons Science, Innovation and Technology Committee to support their ongoing scrutiny, as well as other parliamentary Committees that may have an interest in the Act. The Act requires the Secretary of State to review the effectiveness of the regime two to five years after the legislation comes into force. We will ensure that Parliament is central to that process. I encourage the right hon. and learned Member to continue to raise the matter with the right people.

Most hon. Members raised the issue of apps. Ofcom will have a duty to publish a report on the role of app stores in children’s access to harmful content on the apps of regulated services. The report is due between January 2026 and January 2027. Once it is published, the Secretary of State may, if appropriate, make regulations to bring app stores into the scope of the Act. The timing will ensure that Ofcom can prioritise the implementation of the child safety duties. I will write to the right hon. and learned Member for Kenilworth and Southam on the issue of proportionality, as I want to ensure that I give him the full details about how that is being interpreted by Ofcom.

We fully share the concerns of hon. Members over small platforms that host incredibly harmful content, such as hate forums. These dark corners of the internet are often deliberately sought out by individuals who are at risk of being radicalised.

Martin Wrigley Portrait Martin Wrigley
- Hansard - - - Excerpts

If the Government fully support our concerns about small but harmful sites, will the statutory instrument be reworked to bring them back into category 1, as the Act states?

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

The Government are confident that the duties to tackle illegal content and, where relevant, protect children from harmful content will have a meaningful impact on the small but risky services to which the hon. Gentleman refers. Ofcom has created a dedicated supervision taskforce for small but high-risk services, recognising the need for a bespoke approach to securing compliance. The team will focus on high-priority risks, such as CSAM, suicide and hate offences directed at women and girls. Where services do not engage with Ofcom and where there is evidence of non-compliance, Ofcom will move quickly to enforcement action, starting with illegal harm duties from 17 March, so work is being done on that.

The comprehensive illegal content safety duties will apply to all user-to-user services, and the child safety duties will apply to all user-to-user services likely to be accessed by children, including the small but high-risk sites. Those duties will have the most impact in holding the services to account. Because of the deep concerns about these forums, Ofcom has, as I said, created the small but risky supervision taskforce. For example, Ofcom will be asking an initial set of firms that pose a particular risk, including smaller sites, to disclose their illegal content risk assessments by 31 March.

The Government have been clear that we will act where there is evidence that harm is not being adequately addressed despite the duties being in effect, and we have been clear to Ofcom that it has the Government’s and Parliament’s backing to be bold in the implementation of the Online Safety Act. We are in clear agreement that the Act is not the end of the road, and Ofcom has already committed to iterating on the codes of practice, with the first consultation on further measures being launched this spring. The Government remain open minded as to how we ensure that users are kept safe online, and where we need to act, we will. To do so, we must ensure that the actions we take are carefully considered and rooted in evidence.

Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

Will the consultation this spring for the next iterations of the codes include consultation with parliamentarians, or is it solely with platforms?

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

I expect any consultation will have to go through the Secretary of State, and I am sure it will be debated and will come to the House for discussion, but I will happily provide my hon. Friend with more detail on that.

I am grateful to all Members for their contributions to the debate. I look forward to working with the right hon. and learned Member for Kenilworth and Southam, and I hope that he can secure the Committee that he has proposed.

Monica Harding Portrait Monica Harding
- Hansard - - - Excerpts

Can the Minister explain what she meant when she said that Ofcom had to ensure that the codes were as judicial review-proofed as possible? Surely Ofcom’s approach should be to ensure that the codes protect vulnerable users, rather than be judicial review-proofed.

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

The point I was trying to make was that Ofcom is spending time ensuring that it gets the codes right and can implement them as soon as possible, without being delayed by any potential challenge. To avoid any challenge, it must ensure that it gets the codes right.

10:56
Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

I am grateful to everyone who has spoken in the debate. We have talked about the consensus there was in the passage of the Online Safety Bill. I think it is fair to say that that consensus is broadly still present, based on what Members have said this morning, and I am grateful for it.

There is a need to get this Act implemented. I accept what the Minister says about that, and others have made the same point: we do not want to make the best the enemy of the good, and there is always a trade-off between, on the one hand, getting the particular mechanisms that we know will protect people online in place as swiftly as possible, and on the other hand, making them as extensive and effective as possible.

However, given how long it takes for Parliament to make change—I make no apologies for repeating this point—we need to make the best use of the legislation that we have. I have not made a case this morning for extending the parameters of the legislation; I have made a case for using the parameters we already have, which Parliament has already legislated into being and which we have passed over to the regulator for it to use.

I accept that regulation and legislation are not passed for effect; we pass them so that they can work. We do it not to make ourselves feel better, but to make the lives of our constituents better, so the Minister is right to say that the usability of all this should be at the heart of what we are interested in. I accept the point made by the hon. Member for Esher and Walton (Monica Harding) that Ofcom should not be predominantly focused on insulating itself from judicial review. As a former Law Officer, I think that is an impossible task anyway. This legislation and the regulation that follows it will be challenged—the online platforms have every incentive to challenge it. We cannot be so terrified of that prospect that we are unwilling to extend the parameters of the regulation as far as we believe they should go. That is why I think everybody needs to be a tad braver in all this.

Finally, I simply want to repeat the point that many of us have made, which is that we as a Parliament need a way of keeping our eye on what is happening in this space. These debates are great, but shouting at Ofcom through the loudhailer of Westminster Hall is not as effective as a Committee set up to do this in a more structured and, frankly, more productive and consensual way. That is the gap that exists in the landscape of parliamentary oversight, and as we develop more and more digital regulation, as we have to, and as AI advances, we will have to fill that gap. I simply say to the Government that filling it sooner rather than later would be wise.

Question put and agreed to.

Resolved,

That this House has considered the implementation of the Online Safety Act 2023.