Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024 Debate
Baroness Jones of Whitchurch (Labour - Life peer), Department for Business and Trade
Grand Committee
That the Grand Committee do consider the Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024.
My Lords, these regulations were laid before the House on 12 September this year. The Government stated in their manifesto that they would
“use every government tool available to target perpetrators and address the root causes of abuse and violence”
in order to achieve their
“landmark mission to halve violence against women and girls in a decade”.
Through this statutory instrument, we are broadening online platforms’ and search engines’ responsibilities for tackling intimate image abuse under the Online Safety Act. More than one in three women have experienced abuse online. The rise in intimate image abuse is not only devastating for victims but also spreads misogyny on social media that can develop into potentially dangerous relationships offline. One in 14 adults in England and Wales has experienced threats to share intimate images, rising to one in seven young women aged 18 to 34.
It is crucial that we tackle these crimes from every angle, including online, and ensure that tech companies step up and play their part. That is why we are laying this statutory instrument. Through it, we will widen online platforms’ and search engines’ obligations to tackle intimate image abuse under the Online Safety Act. As noble Lords will know, the Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and search services to protect their users from harm.
As part of this, the Act gives service providers new “illegal content duties”. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be
“used for the commission or facilitation of a priority offence”.
They then need to take steps to mitigate identified risks. These will include implementing safety-by-design measures to reduce risks and content moderation systems to remove illegal content where it appears.
The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. They are set out in schedules to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activities under their illegal content duties.
The priority offences list currently includes certain intimate image abuse offences. Through this statutory instrument, we are adding new intimate image abuse offences to the priority list. These new offences sit in the Sexual Offences Act 2003 and took effect earlier this year; they replace an older offence in the Criminal Justice and Courts Act 2015, which has now been repealed. The repealed offence covered sharing intimate images with the intent to cause distress. The new offences are broader: they criminalise sharing intimate images without a reasonable belief that the subject would consent to the sharing. They also cover the sharing of manufactured or manipulated images, including so-called deepfakes.
Since these new offences are more expansive, adding them as priority offences means online platforms will be required to tackle more intimate image abuse on their services. This means that we are broadening the scope of what constitutes illegal intimate image content in the Online Safety Act. It also makes it clear that platforms’ priority illegal content duties extend to AI-generated deepfakes and other manufactured intimate images. This is because the new offences that we are adding explicitly cover this content.
As I have set out, these changes affect the illegal content duties in the Online Safety Act and will ensure that tech companies play their part in removing this content from social media. They form part of a wider range of protections coming into force next spring under the Online Safety Act, which will require social media companies to remove the most harmful illegal content, much of which disproportionately affects women and girls, such as harassment and controlling or coercive behaviour.
Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance documentation. It is currently producing this documentation. We anticipate that the new duties will start to be enforced from spring next year, once Ofcom has issued these codes of practice and they have come into force. Providers will also need to have completed their risk assessment for illegal content by then. We anticipate that Ofcom will recommend that providers take action in a number of areas, including content moderation, reporting and complaints procedures, and safety-by-design steps such as testing their algorithmic systems to see whether illegal content is being recommended to users. We are committed to working with Ofcom to get these protections in place as quickly as possible. We are focused on delivering.
Where companies are not removing and proactively stopping this vile material appearing on their platforms, Ofcom will have robust powers to take enforcement action against them. This includes imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.
In conclusion, through this statutory instrument we are broadening providers’ duties for intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to this harmful kind of illegal content, including where it has been manufactured or manipulated. I hope noble Lords will agree that these measures take the provisions in the Online Safety Act a useful step forward. I commend these regulations to the Committee, and I beg to move.
My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the sharing of sexually explicit material included deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word to be said on the kinds of intimate image abuse offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I shall also start on a positive note and welcome the ongoing focus on online safety. We all aim to make this the safest country in the world in which to be online. The Online Safety Act is the cornerstone of how all of us will continue to pursue this crucial goal. The Act imposed clear legal responsibilities on social media platforms and tech companies, requiring them actively to monitor and manage the content they host. They are required swiftly to remove illegal content and to take proactive measures to prevent harmful material reaching minors. This reflects the deep commitment that we all share to safeguarding children from the dangers of cyberbullying, explicit content and other online threats.
We must also take particular account of the disproportionate harm that women and girls face online. The trends regarding the online abuse and exploitation that disproportionately affect female users are deeply concerning. Addressing these specific challenges is essential if we are to create a truly safe online environment for everyone.
With respect to the Government’s proposed approach to making sharing intimate images without consent a priority offence under the Online Safety Act, this initiative will require social media companies promptly to remove such content from their platforms. This aims to curb the rise in abuse that has been described as “intolerable”—I think rightly—by the Secretary of State. The intent behind this measure is to prevent generations becoming “desensitised” to the devastating effects of online abuse.
Although this appears to signal a strong stance against online harm, it raises the question of what this designation truly accomplishes in practical terms. I am grateful to the Minister for setting this out so clearly, although I am not entirely sure that I altogether followed the differences between the old offences and the new ones. Sharing intimate images without consent is already illegal under current laws. Can we not say, therefore, that the real issue lies not in the absence of legal provision but in the lack of effective enforcement of existing regulation? We have to ensure that any changes we make do not merely add layers of complexity but genuinely strengthen the protections available to victims and improve the responsiveness of platforms in removing harmful content.
With these thoughts in mind, I offer five questions. I apologise; the Minister is welcome to write as necessary, but I welcome her views whether now or in writing. First, why is it necessary to add the sharing of intimate images to the list of priority offences if such acts are already illegal under existing legislation and, specifically, what additional protections or outcomes are expected? The Minister gave some explanation of this, but I would welcome digging a little deeper into that.
Secondly, where consent is used as a defence against the charge of sharing intimate images, what are the Government’s thoughts on how to protect victims from intrusive cross-examination over details of their sexual history?
Thirdly, with respect to nudification technology, the previous Government argued that any photoreal image was covered by “intimate image abuse”—the noble Lord, Lord Clement-Jones, touched on this issue well. Is there any merit in looking at that again?
Fourthly, I am keen to hear the Government’s views on my noble friend Lady Owen’s Private Member’s Bill on nudification. We look forward to debating that in December.
Fifthly, and lastly, what role can or should parents and educators play in supporting the Act’s objectives? How will the Government engage these groups to promote online safety awareness?
My Lords, I thank noble Lords for their contributions to this debate. This is, as I think all noble Lords who have spoken recognise, a really important issue. It is important that we get this legislation right. We believe that updating the priority offences list with a new intimate image abuse offence is the correct, proportionate and evidence-led approach to tackle this type of content, and that it will provide stronger protections for online users. This update will bring us closer to achieving the commitment made in the Government’s manifesto to strengthening the protection for women and girls online.
I will try to cover all the questions asked. My noble friend Lord Stevenson and the noble Baroness, Lady Owen, asked whether we will review the Act and whether the Act is enough. Our immediate focus is on getting the Online Safety Act implemented quickly and effectively. It was designed to tackle illegal content and protect children; we want those protections in place as soon as possible. Having said that, it is right that the Government continually assess the law’s ability to keep up, especially when technology is moving so fast. We will of course look at how effective the protections are and build on the Online Safety Act, based on the evidence. However, our message to social media companies remains clear: “There is no need to wait. You can and should take immediate action to protect your users from these harms”.
The noble Baroness, Lady Owen, asked what further action we are taking against intimate abuse and about the taking, rather than sharing, of intimate images. We are committed to tackling the threat of violence against women and girls in all forms. We are considering what further legislative measures may be needed to strengthen the law on taking intimate images without consent and image abuse. This matter is very much on the Government’s agenda at the moment; I hope that we will be able to report some progress to the noble Baroness soon.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Owen, asked whether creating and making intimate image deepfakes will be an offence. The Government’s manifesto included a commitment to banning the creation of sexually explicit deepfakes. This is a priority for the Government. DSIT is working with the Home Office and the Ministry of Justice to identify the most appropriate legislative vehicle for ensuring that those who create these images without consent face the appropriate punishment. The Government are considering options in this space to protect women and girls from malicious uses of these technologies. The new sharing intimate images offence, which will be added to the OSA priority list through this SI, explicitly includes—for the first time—wholly synthetic manufactured images, such as deepfakes, so they will be tackled under the Online Safety Act.
The noble Baroness, Lady Owen, asked about the material that is already out there and the ability to use a hash database to prevent those intimate images from continuing to circulate. We are aware that the technology exists. Strengthening the intimate image abuse priorities under the Act is a necessary first step to tackling this, and we expect Ofcom to consider the issue in its final draft illegal content codes and guidance and to give more information about both the codes of practice and the further measures that would need to be developed to address it.
Several noble Lords—the noble Viscount, Lord Camrose, the noble Lord, Lord Clement-Jones, and my noble friend Lord Stevenson—asked for more details on the new offences. As I tried to set out in my opening statement, the Online Safety Act repeals the offence of disclosing private sexual photographs and films with the intent to cause distress—this comes under Section 33 of the Criminal Justice and Courts Act 2015 and is commonly known as the revenge porn offence—and replaces it with four new offences.
First, there is a base offence of sharing an intimate image without consent, which carries a maximum penalty of six months’ imprisonment. Secondly, there are two specific-intent offences—the first is sharing an intimate image with intent to cause alarm, humiliation or distress; the second is sharing an intimate image for the purpose of obtaining sexual gratification—each of which carries a maximum penalty of two years’ imprisonment to reflect the more serious culpability of someone who acts without consent and with an additional malign intent. Lastly, there is an offence of threatening to share an intimate image, with a maximum penalty of two years’ imprisonment. This offence applies regardless of whether the image is shared.
These offences capture images that show, or appear to show, a person who is nude, partially nude, engaged in toileting or doing something sexual. These offences include the sharing of manufactured or manipulated images, which are referred to as deepfakes. This recognises that sharing intimate images without the consent of the person they show or appear to show is sufficiently wrongful or harmful to warrant criminalisation.
The noble Viscount, Lord Camrose, asked what is so different about these new offences compared to those in the Act. I stress that it is because they are being given priority status, which may not sound like much but gives considerable extra powers under the Act. There will be new powers and new obligations on platforms. The key point is that these existing offences are being given priority status under the Online Safety Act. There are thousands of things that Ofcom could address, but this content is now on the much smaller list that places very specific obligations on the platforms. Ofcom will monitor this and, as I said earlier, companies can be fined huge sums of money if they do not act, so there is a huge obligation on them to follow through on the priority list.
I hope that I have answered all the questions and that noble Lords agree with me on the importance of updating the priority offences in the Online Safety Act. The noble Viscount, Lord Camrose, asked about parents and made an important point. This is not just about an Act; it is about everybody highlighting that these activities are intolerable and offensive, not just to the individuals concerned but to everybody in society. Parents have a responsibility, as we all do, to ensure that media literacy is at the heart of the education we carry out formally in schools and informally within the home. The noble Viscount is absolutely right on that, and there is more that we could all do. I commend these regulations to the Committee.