I beg to move,
That the Committee has considered the draft Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024.
As ever, Mr Dowd, it is a joy to see you in your seat and, as usual, in a very fine suit. The regulations we are discussing today were laid before the House on 12 September. In our manifesto, the Labour party stated that we would use every Government tool available to target perpetrators and address the root causes of abuse and violence, in order to achieve our landmark mission to halve violence against women and girls in a decade. I am sure the whole Committee would agree with that. Through this statutory instrument, we are broadening the responsibilities of online platforms and search services to tackle image abuse under the Online Safety Act 2023.
As I am sure all members of the Committee will know, the Online Safety Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and on search engines and search services to protect their users from harm. As part of that, the Act gives service providers new illegal content duties. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be used for the commission or facilitation of so-called priority offences. They then need to take steps to mitigate any identified risks. These will include implementing safety-by-design measures to reduce risks, and content moderation systems to remove illegal content where it does appear. The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. The priority offences are set out in schedule 7 to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activity under their illegal content duties.
Sections 66B, 66C and 66D of the Sexual Offences Act 2003, as amended by the Online Safety Act 2023, introduce a series of intimate image abuse offences. Today’s statutory instrument will add the offences to which I have just referred to the list of priority offences—the ones that the organisations must take action on. These offences include the sharing of manufactured or manipulated images, including deepfakes, and sharing images where the intent was to cause distress. This statutory instrument means that online platforms will be required to tackle more intimate image abuse. I hope that the Committee will support what we are doing here.
The new duties will come into force next spring, as the Act provides that Ofcom must be able to implement them within 18 months of Royal Assent. Ofcom will set out, in codes of practice and guidance, the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content; Ofcom is currently producing that documentation. The new duties will start to be enforced from next spring, once Ofcom has issued the codes of practice and they have come into force. Providers will need to have completed their risk assessment for illegal content by then. In other words, the work starts now.
We anticipate that Ofcom will recommend that providers should take action in a number of areas. These include content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithm systems to see whether illegal content is being recommended to users. I am sure that all members of the Committee will be able to think of instances we have read about in the press that would be tackled by precisely this piece of legislation. I would say, because the shadow Minister will speak shortly, that we welcome the work of the previous Government on this. Where we can co-operate across the House to secure strong regulation that ensures that everybody is protected in this sphere, we will work together. I hope that is the tenor of the comments that the shadow Minister will make in a few moments.
Where companies are not removing and proactively stopping that vile material from appearing on their platforms, Ofcom has robust powers to take enforcement action against them, including the possibility of imposing fines of up to £18 million, or 10% of qualifying worldwide revenue—whichever is higher. Although this statutory instrument looks short, it is significant. We are broadening providers’ duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to that harmful illegal content, including where it has been manufactured or manipulated and is in effect a deepfake. I therefore commend these regulations to the Committee.
It is a pleasure to serve under your chairmanship, Mr Dowd. I am happy to confirm that the Opposition will support these regulations, not least because, as the Minister has said, they complement the previous Government’s work on the Online Safety Act, and I was the Minister responsible for implementing the Act from when it received Royal Assent until the general election.
I take great pride in having served in the Government that introduced and passed the Online Safety Act. It places significant new responsibilities and duties on social media companies, platforms and services to increase safety online. However, most importantly, this vital piece of legislation ensures that children are better protected online. Having just attended a roundtable where we listened to victims of online abuse, I know that that is more important than ever. The Minister will share my thoughts on that. I share his sentiment—the Opposition will work with the Government to make sure that victims of online abuse receive justice and are supported and protected.
It is worrying and sad that almost three quarters of teenagers between 13 and 17 have encountered one or more potential harms online, and that three in five secondary school-aged children have been contacted online in a way that potentially made them feel uncomfortable. It is for those reasons that we ensured that the strongest measures in the Online Safety Act protect children. For example, platforms are required to prevent children from accessing harmful and age-inappropriate content and to provide parents and children with clear and accessible ways to report problems online when they arise. Furthermore, the Act requires all in-scope services that allow pornography to use highly effective age assurance to prevent children from accessing it, including services that host user-generated content and services that publish pornography. Ofcom has robust enforcement powers available against companies that fail to fulfil their duties.
The Online Safety Act also includes provisions to protect adult users, as it ensures that major platforms are more transparent about which kinds of potentially harmful content they allow. It gives users more control over the types of content they want to see. I note that Ofcom expects the illegal harm safety duties to become enforceable around March 2025, following Ofcom’s publication of its illegal harm statement in December 2024. Does the Minister agree that platforms do not need to wait for those milestones, as I often said, and should already be taking action to improve safety on their sites? Can he confirm that he is encouraging platforms to take proactive action in advance of any deadlines?
Separately from the Online Safety Act, the last Government also launched the pornography review, which explores the effectiveness of regulation, legislation and the law enforcement response to pornography. Can the Minister provide a reassurance that the review’s final report is on schedule and will be published before the end of the year? Can he also clarify whether the review will consider the impact of violent and harmful pornography on women and girls? I would be grateful for the Minister’s comments on those points and for his co-operation throughout his tenure. I am happy to add our support to these regulations, and to see that the previous Government’s pivotal piece of legislation is making the UK the safest place in the world for a child to be online.
As I said, I welcome the hon. Gentleman. I hope he stays in his place—I do not mean that I hope he stays in the room for the rest of the day, though. It is good when people actually know something about the subject they are talking about in debates in the House, so it is good to have him still in his place. [Interruption.] I hope that is not a note from the Leader of the Opposition saying that he is no longer responsible for this area.
My speaking brief says: “I thank the members of the Committee for their valuable contributions to this debate”, but—well, anyway. The hon. Member made an important point about the protection of children. That is not precisely what this statutory instrument is about; it is about the requirements on platforms and search services to deal with intimate image abuse. That is the very specific thing we are tackling this afternoon. The pornography review is not what we are debating this afternoon either, but I am happy to write to him about that and hope to provide him with the assurances he seeks.
The hon. Member makes the most important point of all when he says that platforms do not have to wait until next March to take action in this field. I am sure that any parent or anybody else watching this part of society with even the slightest interest will know about the significant damage done to our whole social sphere in this area over the last few years. Platforms need to take their responsibilities seriously. They do not need to wait for Ofcom to tell them how to do it; they should be taking action now. They certainly need to make an assessment now, before next March or April, of where any risks are; otherwise there is a danger that Ofcom will immediately take action against them, saying, “Sorry, you haven’t even done the basic minimum that you need to be able to make people safe.”
Everybody wants the online world to be as safe as the world that we all inhabit. The only way to do that is by making sure that the legislation is constantly updated. There are Members who often ask whether we want to update the Online Safety Act, because there are perhaps things that we might need to take further action on in future. We are very focused in the Department on trying to ensure that it is fully implemented in the shape that it is in now, before looking at new versions of the legislation in this field. But where, as in this case, we can take small, sensible measures that will make a significant difference, we are prepared to do that. That is the attitude that we are trying to adopt.
I hope the Committee agrees with me on the importance of updating the priority offences in the Online Safety Act as swiftly as possible, and I therefore commend the regulations to the Committee.
Question put and agreed to.