Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024 Debate

Department: Department for Business and Trade


Monday 28th October 2024

Grand Committee
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, these regulations were laid before the House on 12 September this year. The Government stated in their manifesto that they would

“use every government tool available to target perpetrators and address the root causes of abuse and violence”

in order to achieve their

“landmark mission to halve violence against women and girls in a decade”.

Through this statutory instrument, we are broadening online platforms’ and search engines’ responsibilities for tackling intimate image abuse under the Online Safety Act. More than one in three women have experienced abuse online. The rise in intimate image abuse is not only devastating for victims but also spreads misogyny on social media that can develop into potentially dangerous relationships offline. One in 14 adults in England and Wales has experienced threats to share intimate images, rising to one in seven young women aged 18 to 34.

It is crucial that we tackle these crimes from every angle, including online, and ensure that tech companies step up and play their part. That is why we are laying this statutory instrument. Through it, we will widen online platforms’ and search engines’ obligations to tackle intimate image abuse under the Online Safety Act. As noble Lords will know, the Act received Royal Assent on 26 October 2023. It places strong new duties on online user-to-user platforms and search services to protect their users from harm.

As part of this, the Act gives service providers new “illegal content duties”. Under these duties, online platforms need to assess the risk that their services will allow users to encounter illegal content or be

“used for the commission or facilitation of a priority offence”.

They then need to take steps to mitigate identified risks. These will include implementing safety-by-design measures to reduce risks and content moderation systems to remove illegal content where it appears.

The Online Safety Act sets out a list of priority offences for the purposes of providers’ illegal content duties. These offences reflect the most serious and prevalent online illegal content and activity. They are set out in schedules to the Act. Platforms will need to take additional steps to tackle these kinds of illegal activities under their illegal content duties.

The priority offences list currently includes certain intimate image abuse offences. Through this statutory instrument, we are adding new intimate image abuse offences to the priority list. This replaces an old intimate image abuse offence, which has now been repealed. These new offences are in the Sexual Offences Act 2003. They took effect earlier this year. The older offence was in the Criminal Justice and Courts Act 2015. The repealed offence covered sharing intimate images where the intent was to cause distress. The new offences are broader; they criminalise sharing intimate images without having a reasonable belief that the subject would consent to sharing the images. These offences include the sharing of manufactured or manipulated images, including so-called deepfakes.

Since these new offences are more expansive, adding them as priority offences means online platforms will be required to tackle more intimate image abuse on their services. This means that we are broadening the scope of what constitutes illegal intimate image content in the Online Safety Act. It also makes it clear that platforms’ priority illegal content duties extend to AI-generated deepfakes and other manufactured intimate images. This is because the new offences that we are adding explicitly cover this content.

As I have set out above, these changes affect the illegal content duties in the Online Safety Act. They will ensure that tech companies play their part in kicking this content off social media. These changes form part of a range of wider protections coming into force next spring under the Online Safety Act, which will require social media companies to remove the most harmful illegal content, much of which disproportionately affects women and girls, such as harassment and controlling or coercive behaviour.

Ofcom will set out the specific steps that providers can take to fulfil their illegal content duties for intimate image abuse and other illegal content in codes of practice and guidance documentation. It is currently producing this documentation. We anticipate that the new duties will start to be enforced from spring next year once Ofcom has issued these codes of practice and they have come into force. Providers will also need to have done their risk assessment for illegal content by then. We anticipate that Ofcom will recommend that providers should take action in a number of areas. These include content moderation, reporting and complaints procedures, and safety-by-design steps, such as testing their algorithm systems to see whether illegal content is being recommended to users. We are committed to working with Ofcom to get these protections in place as quickly as possible. We are focused on delivering.

Where companies fail to remove this vile material and to stop it proactively from appearing on their platforms, Ofcom will have robust powers to take enforcement action against them. This includes imposing fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is higher.

In conclusion, through this statutory instrument we are broadening providers’ duties in respect of intimate image abuse content. Service providers will need to take proactive steps to search for, remove and limit people’s exposure to this harmful kind of illegal content, including where it has been manufactured or manipulated. I hope noble Lords will agree that these changes take the provisions in the Online Safety Act a useful step forward. I commend these regulations to the Committee, and I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.

I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the sharing of sexually explicit material included deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.

I do not believe that this is the last word to be said on the kinds of IIA offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.

Baroness Owen of Alderley Edge (Con)

My Lords, I welcome the Minister’s remarks and the Government’s step to introduce this SI. I have concerns that it misses the wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.

My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.

The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.

If we take the example of a man who was convicted in the UK of blackmailing 200 women, the Revenge Porn Helpline successfully removed 161,000 images but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal.