Data (Use and Access) Bill [HL]

Baroness Owen of Alderley Edge Excerpts
It should not be possible for the Home Office to manage and for the MoJ not to manage. We need a Government where all departments work on behalf of all victims. I will wait to hear what the Minister says, and I very much hope I can congratulate her when I stand up again. I beg to move.
Baroness Owen of Alderley Edge (Con)

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to make Second Reading and for not being present for my last amendment; as a newer Peer, I am very new to this and still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.

I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images to use as training data, all the way through to possessing a model. With these apps easily downloadable on app stores, there is a lack of friction in the process. This means that we have seen horrific cases of children using these apps in schools across the world with devastating consequences. In summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.

I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.

I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.

The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation to prevent any creation laws being circumvented by asking those in other jurisdictions to create such content for you through the uploading of clothed images to forums. Finally, it includes forced deletion so that victims can clearly see their rights to have the content destroyed from any devices or cloud-based programmes and do not have to live in fear that their perpetrator is still in possession of their content.

This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.

Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as of Gareth Southgate at the 2024 Euros. However, the most state-of-the-art systems now require around three seconds of voice audio data to create speech on a par with a human voice. This could be data from a short phone call or a TikTok video. As we are reaching the point where less data is required to create high-quality audio, this now has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could rapidly develop in the way that sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use combined with its accessibility could create a huge risk in future.

Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:

“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.
In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.

Baroness Gohir (CB)

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.

The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. We were disappointed across the House that the Government were not very supportive of the Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.

I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.

Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate and make victims feel unsafe without even sharing, or threatening to share, the material. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.

The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that they possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures where people will be worried about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.

Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.

Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024

Monday 28th October 2024

Grand Committee
Lord Clement-Jones (LD)

My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.

I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the sharing of sexually explicit material included deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.

I do not believe that this is the last word to be said on the kinds of IIA offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.

Baroness Owen of Alderley Edge (Con)

My Lords, I welcome the Minister’s remarks and the Government’s step to introduce this SI. I have concerns that it misses the wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.

My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.

The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.

If we take the example of a man who was convicted in the UK of blackmailing 200 women, the Revenge Porn Helpline successfully removed 161,000 images but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal.

Independent Pornography Review

Monday 14th October 2024

Lords Chamber
Baroness Jones of Whitchurch (Lab)

I thank my noble friend for her question. We are absolutely determined to keep children safe online and to use the Online Safety Act to provide protection across all the categories under its jurisdiction. Ofcom’s draft guidance lays out which technologies could constitute, for example, highly effective age assurance to protect children, and it will have a full range of enforcement powers to take action against companies that do not follow the duties, including substantial fines. I absolutely agree with my noble friend that robustness is key here. I think some people are frustrated that some of the duties in the Online Safety Act are taking time to be rolled out, but it was a feature of the Act that it would be done on that basis. We are very keen, as everybody in the House is, to see it enacted in full as soon as it can be.

Baroness Owen of Alderley Edge (Con)

My Lords, the Revenge Porn Helpline has a removal rate of 90% of non-consensually shared intimate content, including deepfakes. However, in 10% of cases, the host site will not comply with its removal, even where there has been a successful conviction. These sites are often hosted in Russia and Latin America, and are unlikely to come under Ofcom’s scope, even with the changes that make sharing a priority offence. Can the Minister inform the House what action the Government are taking to address non-compliance, and does she agree that it would be better to adopt a rapid and wide-ranging approach—favoured by victims—to deem NCII content illegal, thus giving internet service providers the power to block it?

Baroness Jones of Whitchurch (Lab)

I thank the noble Baroness for her continuing interest in this issue and her campaigning work. The Government have already put forward secondary legislation to ensure that the new intimate image abuse offence is made a priority under the Online Safety Act, and all other acts of deepfake portrayal will come under the Act if they are illegal. Going back to the earlier question about robustness, we absolutely expect Ofcom to implement those protections in a robust way.

Electronic Media: False Information

Thursday 12th September 2024

Lords Chamber
Baroness Jones of Whitchurch (Lab)

My noble friend makes the important point that international co-operation is absolutely vital. We continue to talk to all our friends across the globe, exchanging information and making sure that best practice arises from those discussions.

Baroness Owen of Alderley Edge (Con)

My Lords, research by Vodafone found that algorithms are pushing content to boys related to misogyny and violence following innocent and unrelated searches. Can the Minister say whether the Government are looking into how these algorithms have been used not only to push misinformation and disinformation but to push people towards and reinforce more extreme views?

Baroness Jones of Whitchurch (Lab)

My Lords, deepfakes and other forms of manipulated media are captured by the Online Safety Act where they constitute illegal content or harmful content to children in scope of the regulatory framework. Under the Act, all companies will be forced to take action against illegal content online, including illegal misinformation and disinformation, and they will be required to remove in-scope content. These duties will also apply to in-scope AI-generated content and AI-powered features.