Grand Committee

My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.
The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. Across the House, we were disappointed that the Government were not more supportive of her Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.
I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.
Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate victims and make them feel unsafe without even sharing the material or threatening to share it. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.
The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that these recordings exist leaves these women feeling unsafe and living in fear. In communities and cultures where honour-based abuse is a concern, women will be even more fearful of the repercussions if these audio recordings are shared.
Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.
My Lords, I will speak in support of Amendment 203, which I have signed, and Amendments 211G and 211H in my noble friend Lady Owen’s name.
At Second Reading, the mood of the House was to consider and support the enormous opportunity that comes from AI and to acknowledge the dangers of overregulation that might, somehow, smother this massive opportunity. I endorse that sentiment. However, Amendment 203 addresses computer-generated child sexual abuse material, which I regard as a red line that we should not cross. If we leave this amendment out of the Bill and cannot tackle this one massive issue of CSAM generated by AI, we will leave the whole question of the integrity and purpose of AI vulnerable to misuse by criminals and perverts.
The scale of the issue is already enormous. The Internet Watch Foundation found 275,000 webpages containing child sexual abuse content. On just one forum, 20,000 AI-generated images were posted in a single month, over 3,000 of which depicted criminal acts of child sexual abuse. This is not a hypothetical problem or some kind of dystopian imagining; it is happening right now. There are offices filled with people generating this material for their pleasure and for commercial reasons. That is why it is urgent that we move immediately.
Any of us who has heard the testimony of the many victims of sexual abuse will realise that the experience creates lasting anxiety and gut-wrenching trauma. These are not just pictures or videos; they often represent real harm to real people. That is why urgency is so important and this amendment is so critical.
Shockingly, the explosion of this kind of material is enabled by publicly available tools, as the noble Baroness, Lady Kidron, pointed out. The case of Hugh Nelson is a very good example. He was sentenced to 18 years in prison for creating AI videos of children being physically and sexually abused. The tool he used was Daz 3D, AI software that any of us could access from this Room. It is inconceivable that this technology remains unregulated while being weaponised by people such as Hugh Nelson to inflict huge harm. Currently, our law focuses on the possession and distribution of CSAM but fails to address the mechanisms of its creation. That is a loophole and why I support these amendments. I do so for three key reasons.
First, Amendment 203 would criminalise the creation, training and distribution of AI models that can create CSAM. That would mean that Daz and other sites like it must introduce safety-by-design measures to stop their use for creating illegal content. That is not to smother the great and bountiful explosion of beneficial AI; it is to create the most basic guard-rail that should be embedded in any of these dangerous tools.