
International Women’s Day

Friday 8th March 2024


Lords Chamber
Baroness Owen of Alderley Edge (Con)

My Lords, I am delighted to participate in my first International Women’s Day debate in your Lordships’ House. I welcome the noble Baroness, Lady Casey, and look forward to her maiden speech.

I wish to focus my remarks today on an area I believe to be the new frontier for violence against women: the creation of deepfakes. A deepfake can be an image or video digitally altered to put someone's face on another person's body. Such fakes are often so convincing that it is incredibly difficult to tell they are not real. Women are disproportionately affected by the creation of deepfake material. Research published by Home Security Heroes in 2023 found that 98% of all deepfake videos were pornographic, and of those, 99% were of women.

I have had the privilege over the last few months of meeting a number of campaigners and charities in this area, including the renowned Professor Clare McGlynn KC and Elena Michael from the campaign group Not Your Porn. All agree that the Online Safety Bill represented a vital first step in creating a safer online environment. However, I share their concern that it has not gone far enough: it bans only the nonconsensual sharing of deepfake material and does not tackle the creation of the content itself.

I have previously spoken in your Lordships' House to highlight that the use of nudification apps and the creation of deepfake porn for private use are still legal. The largest site creating deepfakes receives an average of 13.4 million hits a month. The rapid proliferation of these nudification apps, with 80% of them launched in the last 12 months alone, has created an environment where anyone can perpetrate harm with ease, and it is not recognised as misogyny. This is backed up by research finding that 74% of deepfake pornography users do not feel guilty about their actions.

Your Lordships may be surprised to hear that, after long consultation, the Law Commission stated that, while it acknowledged that the making of intimate images was a violation of the subject’s sexual autonomy, it was less sure whether the level of harm caused by the making of these nonconsensual images and videos was serious enough to criminalise, and that any offence would prove difficult to enforce.

This report was published in July 2022. In 2023, more deepfake abuse videos were posted online than in all previous years combined. Since International Women's Day last year, the number of new pieces of content created each week has increased tenfold. I am sure many of your Lordships will agree that the creation of this material without a person's consent, in and of itself, causes serious harm, regardless of whether the person is aware of its creation, and has a much wider societal impact in normalising online misogyny and hate.

We are now at the precipice of a new age of technology. It is vital that we act now to ensure that, in embarking on this brave new world, which can offer many exciting opportunities, we do not risk creating a technological gender gap that would further limit the economic inclusion of women in society.

The ability to create these images and videos using apps and platforms in a matter of seconds represents a very real threat to all women. A woman can no longer choose who owns an intimate image of her: such images can be created by anyone, anywhere, at any time. The impact is often referred to as a silencing effect. Women may withdraw from social media and sometimes even from normal life. Many women are fearful of this happening to them.

As it stands, the law prioritises the freedom of speech and expression of the creator over that of the woman. After a century of fighting for women's right to enter a space, we now run the risk of a new space being created where women fear to tread. While we are still learning about AI, it is crucial that we educate society to differentiate between what is real and what is not, in a world where we can no longer trust the images put in front of us. Time is of the essence. We must not miss the chance to act by legislating against the creation of nonconsensual deepfake content. We must prevent the normalisation of misogyny. Deepfake abuse is the new frontier of violence against women, and we must all take a stand.