Draft Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024 Debate
Chris Bryant (Labour, Rhondda and Ogmore), Department for Science, Innovation & Technology
General Committees

I beg to move,
That the Committee has considered the draft Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024.
As ever, Mr Dowd, it is a joy to see you in your seat and, as usual, in a very fine suit. The regulations we are discussing today were laid before the House on 12 September. In our manifesto, the Labour party stated that we would use every Government tool available to target perpetrators and address the root causes of abuse and violence, in order to achieve our landmark mission to halve violence against women and girls in a decade. I am sure the whole Committee would agree with that. Through this statutory instrument, we are broadening the responsibilities of online platforms and search services to tackle image abuse under the Online Safety Act 2023.
As I am sure all members of the Committee will know, the Online Safety Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and on search engines and search services to protect their users from harm. As part of that, the Act gives service providers new illegal content duties. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be used for the commission or facilitation of so-called priority offences. They then need to take steps to mitigate any identified risks. These will include implementing safety-by-design measures to reduce risks, and content moderation systems to remove illegal content where it does appear. The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. The priority offences are set out in schedule 7 to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activity under their illegal content duties.
Sections 66B, 66C and 66D of the Sexual Offences Act 2003, as amended by the Online Safety Act 2023, introduce a series of intimate image abuse offences. Today’s statutory instrument will add the offences to which I have just referred to the list of priority offences—the ones that the organisations must take action on. These offences include the sharing of manufactured or manipulated images, including deepfakes, and sharing images where the intent was to cause distress. This statutory instrument means that online platforms will be required to tackle more intimate image abuse. I hope that the Committee will support what we are doing here.
The new duties will come into force next spring, as the Act requires Ofcom to be in a position to implement them within 18 months of Royal Assent. Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance documentation, which it is currently producing. Enforcement will begin in spring next year, as soon as Ofcom has issued the codes of practice and they have come into force. Providers will need to have completed their risk assessments for illegal content by then. In other words, the work starts now.
We anticipate that Ofcom will recommend that providers should take action in a number of areas. These include content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithm systems to see whether illegal content is being recommended to users. I am sure that all members of the Committee will be able to think of instances we have read about in the press that would be tackled by precisely this piece of legislation. I would say, because the shadow Minister will speak shortly, that we welcome the work of the previous Government on this. Where we can co-operate across the House to secure strong regulation that ensures that everybody is protected in this sphere, we will work together. I hope that is the tenor of the comments that the shadow Minister will make in a few moments.
Where companies are not removing and proactively stopping that vile material from appearing on their platforms, Ofcom has robust powers to take enforcement action against them, including the possibility of imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher. Although this statutory instrument looks short, it is significant. We are broadening providers' duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people's exposure to that harmful illegal content, including where it has been manufactured or manipulated and is in effect a deepfake. I therefore commend these regulations to the Committee.
As I said, I welcome the hon. Gentleman. I hope he stays in his place—I do not mean that I hope he stays in the room for the rest of the day, though. It is good when people actually know something about the subject they are talking about in debates in the House, so it is good to have him still in his place. [Interruption.] I hope that is not a note from the Leader of the Opposition saying that he is no longer responsible for this area.
My speaking brief says: “I thank the members of the Committee for their valuable contributions to this debate”, but—well, anyway. The hon. Member made an important point about the protection of children. That is not precisely what this statutory instrument is about; it is about the requirements on platforms and search services to deal with intimate image abuse. That is the very specific thing we are tackling this afternoon. The pornography review is not what we are debating this afternoon either, but I am happy to write to him about that and hope to provide him with the assurances he seeks.
The hon. Member makes the most important point of all when he says that platforms do not have to wait until next March to take action in this field. I am sure that any parent, or anybody else watching this part of society with even the slightest interest, will know about the significant damage done to our whole social sphere in this area over the last few years. Platforms need to take their responsibilities seriously. They do not need to wait for Ofcom to tell them how to do it; they should be taking action now. They certainly need to assess now, before next March or April, where any risks lie, because otherwise there is a danger that Ofcom will immediately take action against them, saying, "Sorry, you haven't even done the basic minimum that you need to do to keep people safe."
Everybody wants the online world to be as safe as the world that we all inhabit. The only way to achieve that is to make sure that the legislation is constantly updated. Members often ask whether we want to update the Online Safety Act, because there are perhaps things on which we might need to take further action in future. In the Department, we are very focused on ensuring that the Act is fully implemented in its current shape before looking at new versions of legislation in this field. But where, as in this case, we can take small, sensible measures that will make a significant difference, we are prepared to do so. That is the attitude that we are trying to adopt.
I hope the Committee agrees with me on the importance of updating the priority offences in the Online Safety Act as swiftly as possible, and I therefore commend the regulations to the Committee.
Question put and agreed to.