Debates between Baroness Gohir and Baroness Owen of Alderley Edge during the 2024 Parliament

Wed 18th Dec 2024

Data (Use and Access) Bill [HL]

Debate between Baroness Gohir and Baroness Owen of Alderley Edge
Baroness Owen of Alderley Edge (Con)

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to attend Second Reading and for not being present for my last amendment; as a newer Peer, I am still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.

I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images for use as training data all the way through to possessing a model. With these apps easily downloadable from app stores, there is little friction in the process, and we have seen horrific cases of children using them in schools across the world, with devastating consequences. In the summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.

I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof the law with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.

I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.

The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation, to prevent the creation offence being circumvented by asking those in other jurisdictions to create such content for you by uploading clothed images to forums. Finally, it includes forced deletion, so that victims can clearly see their right to have the content deleted from any devices or cloud-based programs and do not have to live in fear that their perpetrator is still in possession of it.

This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.

Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as the cloning of Gareth Southgate’s voice at the 2024 Euros. However, the most advanced systems now require only around three seconds of voice audio to create speech on a par with a human voice. This could be data from a short phone call or a TikTok video. As we reach the point where less and less data is required to create high-quality audio, this technology has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could develop as rapidly as sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use, combined with its accessibility, could create a huge risk in future.

Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:

“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.

In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.

Baroness Gohir (CB)

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I too apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron; everything she says is very powerful.

The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. Across the House, we were disappointed that the Government were not very supportive of her Bill, but they hinted that its provisions could be incorporated into another Bill. This Bill could be it.

I will focus my comments on audio recordings, which I raised last week. This element gets overlooked because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activity without consent and either share it or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can now create audio deepfakes very easily with new technologies. A person’s voice is recognisable to those who know them, so this must be addressed, and it can be in this Bill.

Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate victims and make them feel unsafe without even sharing the material or threatening to share it. They can manipulate and control their victims simply by making them aware that they have recorded or created this material.

The Muslim Women’s Network UK helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes they have shared them online or with family members and friends. Just knowing that their abusers possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures, where people worry about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.

Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving violence against women and girls, they must shut down every avenue of abuse and accept these amendments.