All 2 Baroness Gohir contributions to the Data (Use and Access) Bill [HL] 2024-26


Wed 18th Dec 2024
Tue 28th Jan 2025

Data (Use and Access) Bill [HL] Debate

Department: Department for Business and Trade

Data (Use and Access) Bill [HL]

Baroness Gohir Excerpts
Baroness Owen of Alderley Edge (Con)

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to make Second Reading and for not being present for my last amendment; as a newer Peer, I am very new to this and still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.

I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images to use as training data, all the way through to possessing a model. With these apps easily downloadable on app stores, there is a lack of friction in the process. This means that we have seen horrific cases of children using these apps in schools across the world with devastating consequences. In summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.

I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.

I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.

The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation to prevent any creation laws being circumnavigated by asking those in other jurisdictions to create such content for you through the uploading of clothed images to forums. Finally, it includes forced deletion so that victims can clearly see their rights to have the content destroyed from any devices or cloud-based programmes and do not have to live in fear that their perpetrator is still in possession of their content.

This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.

Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as of Gareth Southgate at the 2024 Euros. However, the most state-of-the-art systems now require around three seconds of voice audio data to create speech on a parity with a human. This could be data from a short phone call or a TikTok video. As we are reaching the point where less data is required to create high-quality audio, this now has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could rapidly develop in the way that sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use combined with its accessibility could create a huge risk in future.

Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:

“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.


In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.

Baroness Gohir (CB)

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.

The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. We were disappointed across the House that the Government were not very supportive of the Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.

I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.

Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate and make victims feel unsafe without even sharing, or threatening to share, the material. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.

The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that they possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures where people will be worried about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.

Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.

Lord Bethell (Con)

My Lords, I will speak in support of Amendment 203, which I have signed, and Amendments 211G and 211H in my noble friend Lady Owen’s name.

At Second Reading, the mood of the House was to consider and support the enormous opportunity that comes from AI and to acknowledge the dangers of overregulation that might, somehow, smother this massive opportunity. I endorse that sentiment. However, Amendment 203 addresses computer-generated child sexual abuse material, which I regard as a red line that we should not cross. If we leave this amendment out of the Bill and cannot tackle this one massive issue of CSAM generated by AI, we will leave the whole question of the integrity and purpose of AI vulnerable to misuse by criminals and perverts.

The scale of the issue is already enormous. The Internet Watch Foundation found 275,000 webpages containing child sexual abuse content. On just one forum, 20,000 AI-generated images were posted in a single month, over 3,000 of which depicted criminal acts of child sexual abuse. This is not a hypothetical problem or some kind of visioneering or dystopian imagination; it is happening right now. There are offices filled with people generating this material for their pleasure and for commercial reasons. That is why it is urgent that we move immediately.

Any of us who has heard the testimony of the many victims of sexual abuse will realise that the experience creates lasting anxiety and gut-wrenching trauma. These are not just pictures or videos; they often represent real harm to real people. That is why urgency is so important and this amendment is so critical.

Shockingly, the explosion of this kind of material is enabled by publicly available tools, as the noble Baroness, Lady Kidron, pointed out. The case of Hugh Nelson is a very good example. He was sentenced to 18 years in prison for creating AI videos of children being physically and sexually abused. The tool he used was Daz 3D, AI software that any of us could access from this Room. It is inconceivable that this technology remains unregulated while being weaponised by people such as Hugh Nelson to inflict huge harm. Currently, our law focuses on the possession and distribution of CSAM but fails to address the mechanisms of its creation. That is a loophole and why I support these amendments. I do so for three key reasons.

First, Amendment 203 would criminalise the creation, training and distribution of AI models that can create CSAM. That would mean that Daz and other sites like it must introduce safety-by-design measures to stop their use for creating illegal content. That is not to smother the great and bountiful explosion of beneficial AI; it is to create the most basic guard-rail that should be embedded in any of these dangerous tools.

Data (Use and Access) Bill [HL]

Baroness Gohir Excerpts
Baroness Owen of Alderley Edge (Con)

My Lords, I shall speak to Amendments 69, 70 and 72. I declare my interest as a guest of Google at its Future Forum and AI policy conference. I will also speak to government Amendments 56A, 74A and 77. I am grateful to the Government—particularly the Ministers, the noble Lord, Lord Ponsonby, and Sarah Sackman, who I know want to do the right thing by victim survivors—for taking the time to meet me and other noble Lords from across this House, and for the movement they have made in not pressing their own amendment.

I am so grateful for the offer to work together to put victim survivor experience at the heart of our legislation. As I have always advocated, a consent-based approach is the only approach that shows that the violation of a woman’s consent through the non-consensual creation of sexually explicit images and films is an act of abuse, regardless of a person’s motivation.

I am pleased that the Government have finally conceded that a woman’s consent is enough and, in doing so, will not press their amendments. I turn first to Amendment 69, in the names of the noble Lords, Lord Browne of Ladyton and Lord Clement-Jones, and the noble Baroness, Lady Kidron. In doing so, I thank them for their steadfast and unwavering support.

I understand that the Government wish to bring forward their own amendment in time for Third Reading. I need absolute assurances from the Minister that it will be consent-based, as he has confirmed; that it will cover solicitation; that it will use the same definition of “an intimate state” as in the pre-existing sharing offence; that the limitation of time under the Magistrates’ Courts Act will run from the date on which the victim becomes aware that the content has been created, and not the date on which it was created; and that it will make clear in law that content used for image-based abuse falls within Section 153 of the Sentencing Code.

If I cannot have absolute assurance from the noble Lord, I am motivated to test the opinion of the House, because a deepfake offence without the inclusion of solicitation will not be holistic. Amendment 69 vitally includes the solicitation of this content in order to close the gaps in the law and ensure that it cannot be circumnavigated by asking someone else in another jurisdiction where they have not yet legislated to create the content for you. It makes it an offence to solicit the content whether or not the creation happens. This vitally reflects the borderless nature of the internet and ensures that those in the UK who seek to abuse women by circumnavigating the proposed law will be held accountable.

Anyone who has had to witness their clothed images being touted on these sites dedicated to abuse will be subject to enormous fear and forced to live under the threat that the creation of sexually explicit content could happen at any moment. I would be grateful for the Minister’s absolute assurance that this will be part of the Government’s new amendment that they will bring at Third Reading and that it will be a consent-based solicitation offence. Without the inclusion of solicitation, we will be left with a gaping omission in our legislation.

My amendment uses the definition of “an intimate state” from Sections 66D(5), (6) and (7) of the Sexual Offences Act 2003 in order to have consistency with the pre-existing sharing laws. Unlike with the government amendment, victims will not have two separate definitions to contend with, depending on whether their image has been created or shared or both. I would just like a final reassurance from the Minister that this will be the definition.

My amendment clearly states, in relation to Section 127(1) of the Magistrates’ Courts Act 1980 on the limitation of time, that the date on which the matter of complaint arose will be taken as the date on which the victim becomes aware of the content, as opposed to the date on which the perpetrator created the content. I need assurance from the Minister that their proposed amendment would do the same, so women are not inadvertently timed out of seeking justice. This issue was highlighted to me by campaigners at #NotYourPorn.

I turn now to Amendment 70 on the deletion of data used to perpetrate intimate image abuse. Following Committee, where I explained to the House that victims were being retraumatised by their abusers still being in possession of sexually explicit content of them following successful prosecution, I was very disheartened by the government response that no action was necessary due to Section 153 of the Sentencing Act 2020. I believe clarity under the pre-existing law is essential in order to avoid situations where victims are left traumatised and in a state of anxiety by their abusers keeping their intimate images.

However, I am very pleased that, following my amendment, the Government have had a change of heart. I understand that they are now willing to commit to amending the deprivation order powers in Section 153 of the Sentencing Act 2020 to ensure that courts can apply the orders to images and videos relating to the conviction of this offence, and any hardware. I would need the Minister’s assurance that it would also include physical copies and those held on any device, cloud-based programmes, digital messaging or social media platforms that the perpetrator controls. I would also like the commitment that this will be applied to the other pre-existing intimate image abuse offences, as my amendment did.

I turn to Amendment 72, which is in my name and those of the noble Baroness, Lady Gohir, and the noble Lord, Lord Clement-Jones. The noble Baroness, Lady Gohir, has previously highlighted to the House the growing problem of audio abuse. It is easy to envisage that, in only a short space of time, we could very realistically be in the same place on audio abuse as we are with sexually explicit deepfakes, as less data is required to create high-quality audio. This has the potential to be weaponised to yet again abuse women. We have the chance now to be proactive. I hope the Government, if they are not prepared to commit to it now, will take it seriously in their upcoming justice Bill.

As I have set out, I am extremely grateful for the Minister’s movement on these issues. I know that it is not straightforward to produce complex amendments at speed and I know the Minister is committed to getting the details right in this vital legislation. I expect the Government to provide an undertaking to bring amendments back at Third Reading to address this issue. Unless I receive reassurances that such amendments will address all the issues in the manner I have set out, I will test the opinion of the House. If I receive the reassurances that I am looking for today but, for any reason, the Government do not follow through with them at Third Reading, I reserve the right to bring back my own amendment covering all the elements I have raised on this important issue. I look forward to hearing from the Minister.

Baroness Gohir (CB)

My Lords, I support everything the noble Baroness, Lady Owen, has said. I declare my interests as set out in the register. I will briefly speak on Amendment 72 about sexually explicit audio abuse, which I have raised a couple of times before.

I am concerned about why, now the Government know and are aware that sexually explicit audio abuse is a thing, they do not want to act now. We have victims right now. Perpetrators are making these recordings and using them to threaten and blackmail. They share these recordings to shame their victims and to maintain power and control. In some communities where shame and honour are a thing, those victims are then at risk of honour-based abuse. With new technologies, you can create deepfake audio as well.

It feels like the Government are kicking this into the long grass. I welcome the Minister’s comments that this will be considered, but there seems to be no timetable; it could be years before action is taken. I wonder whether the Government are waiting for there to be more noise on the issue and for more victims to come forward before they take action. Why not nip this in the bud now? The Minister mentioned the crime and policing Bill. It would be good to know why, for example, it cannot be included in that. I hope that we can shut down this avenue of abuse now and prevent there being more victims.

Lord Browne of Ladyton (Lab)

My Lords, I will speak to Amendments 69 and 70, to which I have added my name. I support the other amendments in the group, but I will leave others to speak to them because they own them. I do not think my noble friend the Minister wishes me to support his amendment, given what he has told us.

I take this opportunity to pay tribute to the noble Baroness, Lady Owen of Alderley Edge, whose campaigning on these issues has been a model of its kind. She has brought not only passion and commitment but astonishing forensic scrutiny to bear on them. She is to be commended for getting us to the place we are in today. I hope my noble friend the Minister will help her get to the destination she has set for us, which is the appropriate destination for this legislation.

The noble Baroness also brought me and others into contact with victims and survivors of this appalling sexual abuse and those who support them, which has been an extraordinary privilege too. Mostly young women, they are immensely impressive in the way they have worked together. Almost all of the many thousands of people who have already been victims of this appalling abuse have been extraordinarily well represented.

I also thank those who have supported them. I will pick out Professor Clare McGlynn KC of Durham University and read part of the briefing paper that she produced for this occasion. I hope that all noble Lords who wish to participate in this debate have seen it. I know it has had a significant effect on people; I will not mention who they are, but I know that when they read it they were significantly affected by it.