(2 weeks, 4 days ago)
Lords Chamber

My Lords, the Government recognise the destructive role that misogynistic attitudes, including online misogynistic content, can play in society, and the impact they can have on the views and behaviours of men and boys. Tackling misogyny both online and offline is central to our mission to halve violence against women and girls in a decade, supporting victims and preventing harm in our communities. We will publish a new violence against women strategy this year. The Government will ensure that schools address the root causes of violence against women and girls, and teach pupils about healthy relationships and consent, and will continue to ensure children and young people are at the heart of prevention and intervention programmes and policies.
My Lords, one of the key themes in “Adolescence” was intimate image abuse. Just this week, the Government have rejected the Women and Equalities Committee recommendation to increase from six months the time limit for victims to seek justice when their intimate images have been non-consensually shared. Will the Minister explain the Government’s reasoning for rejecting a change that would help so many victims?
My Lords, the Government welcome the Women and Equalities Committee report on tackling non-consensual intimate image abuse, and the issues it raises are an absolute priority for us. That is why we have taken action by strengthening the Online Safety Act and introducing further offences as part of the Crime and Policing Bill and the Data (Use and Access) Bill—and I pay tribute to the noble Baroness for all the work she has done in helping us to strengthen that legislation. We will not hesitate to go further to protect women and girls online. Technology-facilitated abuse will be a key component of the upcoming cross-government violence against women and girls strategy.
(3 weeks, 6 days ago)
Lords Chamber

My Lords, with the leave of the House, I will also speak to Amendments 54 to 74 and 79.
We all agree that tackling the abuse of intimate image deepfakes is incredibly important. I am delighted that these provisions are returning to this House, having been strengthened in the other place, enabling us once again to discuss this key issue. I extend my heartfelt thanks to the noble Baroness, Lady Owen, for her dedication on this issue. I am also grateful to the noble Lords, Lord Pannick—who unfortunately is not in his place—and Lord Clement-Jones, and others who have generously given much of their time to discussing this issue with me. Their engagement with me and my ministerial colleagues has been instrumental as we have refined our approach to this important topic. It has been a fantastic example of parliamentarians working across the House to get policy in the strongest possible position.
At Third Reading I committed that the Government would bring forward further amendments in the Commons, including on solicitation and time limits. We have delivered on those commitments. I will begin with Commons Amendment 56, which introduces the requesting offence. This addresses the commitment made on solicitation. It replaces, but builds on and delivers the same intent as, the amendment that your Lordships made to the Bill. It comprehensively criminalises asking someone to create a deepfake intimate image for you without the consent of the person in the image or the reasonable belief in their consent. This is an offence regardless of where the person you are asking is based or whether the image was in fact created.
I turn to the commitment on time limits. Commons Amendment 63 was passed to extend the statutory time limit so that prosecutions can be brought at any date that is both within six months of when sufficient evidence comes to the prosecutor’s knowledge and within three years of when the offence was committed. This means that perpetrators will not get away with creating or requesting the creation of a deepfake just because no one knew about it at the time.
A further change was made in the Commons through Commons Amendment 55, to add a defence of reasonable excuse to both the creating and requesting offences. I know that this is likely to be the subject of much debate today, so I will spend some time setting out the Government’s position.
First, I want to reassure the House that the Government’s priority is to create comprehensive, robust defences which ensure that perpetrators cannot evade justice. It is not our intention that the defences provide defendants with a get-out clause, and we do not believe that they do so. This is especially important to stress for the creation of sexual deepfakes, which are so extraordinarily harmful. In our view, it is extremely unlikely that there will ever be a situation where someone creating a sexually explicit deepfake will be able to prove that they had a reasonable excuse. Indeed, we anticipate that the defences would apply only in an extremely narrow set of circumstances, such as for covert law enforcement operations.
It is also our view that, for a very small minority of cases, such as the creation of genuinely satirical images that are not sexually explicit, the defence to the creating offence is legally necessary for it to be compatible with Article 10 of the European Convention on Human Rights. Without the “reasonable excuse” defence, we consider that the creating offence will not be legally robust, and that any legal challenge to its compatibility with Article 10 is likely to be successful. This will not provide the best protection for the victims. Let me labour this very important point: our intention is to create comprehensive, robust offences that will ensure that those who create or request intimate deepfake images without consent, particularly sexual deepfake images, face grave consequences.
I also want to stress that abusers will not be able to evade justice by using spurious excuses. The defendant must provide enough evidence to prove that the creation, or that particular request, without consent was reasonable. They cannot just say it is art or satire without sufficient compelling evidence. It will be for the court, not the defendant, to decide whether something is in fact art or satire. From my many years as a magistrate, I can also reassure the House that it is simply not the case that a defendant can offer up any excuse and assert that it is reasonable. The CPS will challenge spurious arguments, and the courts are extremely well equipped and used to dealing with such arguments quickly.
The Government share the House’s desire to ensure that criminal law, and these defences in particular, work as well as the Government intend. I therefore speak to support the noble Baroness’s Amendments 55E and 56B, which place a binding obligation on the Government to review the operation of the “reasonable excuse” defence, for both the creating and requesting offences, by putting it in the Bill. As part of this review, we will carry out targeted engagement with external stakeholders and subject matter experts to ensure that we make a broad and informed assessment of the defence.
I hope this addresses the concerns about these defences. The best way to protect victims is to ensure that Parliament passes legally sound and robust offences that can bring perpetrators to justice. I urge the House to do that by supporting Motion 55C and Amendment 56B. I beg to move.
My Lords, I speak to my amendments in this group. In doing so, I declare my interest as a guest of Google at its AI policy conference.
I start by thanking both the Minister and Minister Davies-Jones for taking the time to engage on this issue and for their endless patience. I know they have worked incredibly hard to secure progress on this and I am very grateful for their efforts.
We are down to the issue of whether we believe a person can have a reasonable excuse to create content that looks like a photograph or film of another person without their consent. Noble Lords will recall that this House overwhelmingly indicated that we did not believe “reasonable excuse” should be included as a defence and highlighted concern that it may be misinterpreted or drawn too widely.
I have concerns over the position the Government outlined in their letter from Minister Bryant to the Joint Committee on Human Rights. Minister Bryant argues that the inclusion of “reasonable excuse” is necessary as, without it, the offence would breach the ECHR due to limiting a person’s freedom to create photorealistic satirical art of scenarios such as a person on the toilet or in boxer shorts. Additionally, the Government argued the need for tech companies to be able to red team against this offence.
I share the Government’s strong desire that this Bill should not carry a memorandum warning that it may breach the ECHR, however precarious the arguments laid out may be. I do not want those who abuse women in this way to claim the prosecution may contravene their human rights.
With this in mind, I turn to my first amendments, Amendments 55C and 56B, written in conjunction with the Government, which offer a review of the implementation of “reasonable excuse” for both the creation and requesting offences after two years. I am grateful to the Minister for the compromise. He will know the conflicts I feel about this issue and the great concern I have that, without guardrails, “reasonable excuse” may be used to allow those who abuse others in this sickening way to escape justice.
I know the Minister will offer me reassurance that the courts will be used to hearing precarious excuses. However, my concern—as noble Lords know—is that image-based sexual abuse has been consistently misunderstood, with the Law Commission itself only arguing three years ago that the harm from creating non-consensual sexually explicit content was not serious enough to criminalise. In 2023, Refuge found that, despite steady year-on-year increases in recorded offences for image-based abuse, only 4% of offenders were charged. Even when a conviction was achieved, only 3% of cases resulted in the perpetrator being deprived of the images used for the offence.
We have seen consistent failure by prosecutors to understand and tackle the issue. I therefore have a very real concern that, by allowing “reasonable excuse” to sit in this offence, we risk it being misunderstood and the offence being undermined. Further, while I am grateful for the offer of a review, I am worried that if after two years we find “reasonable excuse” is allowing perpetrators to evade justice, there will not be a legislative vehicle in which to correct the issue, and the time it takes to correct may be lengthy. I would be grateful if the Minister could offer me reassurance on this point.
Additionally, I am concerned by the very premise of the argument that legislation without “reasonable excuse” would breach the ECHR. I have sought the legal counsel of the noble Lord, Lord Pannick, KC—who apologises for not being here this evening—and he believes that the inclusion of “reasonable excuse” in the defence is not necessary in order to be compliant with the ECHR.
The noble Lord, Lord Pannick, advised, as the Joint Committee on Human Rights already highlighted in its letter, that
“the Government has stated that prosecutorial discretion is sufficient to ensure that an offence that could violate a qualified right under the ECHR is nevertheless compliant with it”.
Additionally, all legislation must, so far as possible, be read and given effect to in a manner that is compliant with the ECHR, according to Section 3 of the Human Rights Act 1998. So, even if there were to be a prosecution in the sort of circumstances contemplated by the Government, the defendant could rely on their Article 10 rights, which means that an all-encompassing reasonable excuse is not necessary.
Additionally, I would be grateful if the Minister could outline to the House the reasons why tech companies cannot red team by prompting with the images of people who do consent, so that no reasonable excuse would be required should their model fail and end up creating the very content it is trying to avoid. I would go as far as to say that testing prompts on a model using the image of a person who does not consent would be deeply unethical. It is my belief—and the view of the noble Lord, Lord Pannick, and the noble Baroness, Lady Chakrabarti—that such specific examples do not justify a general reasonable excuse. To quote my friend and human rights advocate, the noble Baroness, Lady Chakrabarti:
“Spurious ECHR arguments for weakening 21st century cyber sex offences do not help the cause of those seeking to defend human rights from its many detractors”.
55C: Leave out from “House” to end and insert “do disagree with the Commons in their Amendment 55, and do propose Amendments 55D and 55E in lieu—
(1 month, 1 week ago)
Lords Chamber

My Lords, through the Crime and Policing Bill, the Government will introduce a new suite of measures to tackle the growing threat of AI. This includes criminalising AI models made or adapted to generate child sexual abuse imagery and extending the existing paedophile manuals offence to cover AI-generated child sexual abuse material. In addition, the Home Office will bolster the network of undercover online police officers to target online offenders and develop cutting-edge AI tools and other new capabilities to infiltrate live streams and chat rooms where children are groomed. The Home Office is developing options at pace on potential device operating system-level safety controls to prevent online exploitation and abuse of children. It is also vital that we tackle the widespread sharing of self-generated indecent imagery. The report shows that 91% of the images are self-generated. These are young people who are being groomed and who often quite innocently share their material, not realising the purpose for which it will be used. This is a huge and pressing issue, and my noble friend quite rightly says that we need to take action now to tackle this scourge.
My Lords, it is clear that, with the constant evolution of technology, we risk not being able to legislate rapidly enough to keep pace. How are the Government conducting their horizon scanning to ensure that we are always one step ahead of those who seek to abuse children in this way?
The noble Baroness is quite right that we have to keep the technology up to date, and of course we are endeavouring to do that. I should say that UK law applies to AI-generated CSAM in the same way as to real child sexual abuse. Creating, possessing or distributing any child sex abuse images, including those generated by AI, is illegal. Generative AI child sexual abuse imagery is priority illegal content under the Online Safety Act in the same way as real content. However, she is quite right: we have to keep abreast of the technology. We are working at pace across government to make sure that we have the capacity to do that.
(5 months, 3 weeks ago)
Grand Committee

My Lords, I rise today in support of Amendment 203 in the name of the noble Baroness, Lady Kidron. I declare an interest as a recent guest of Google at its Future Forum policy conference. I apologise for not being able to make Second Reading and for not being present for my last amendment; as a newer Peer, I am very new to this and still learning as I go. I am very grateful to the noble Baroness, Lady Kidron, for stepping in.
I commend the wording of the noble Baroness’s amendment, which tackles the full process of training these models, from the collection of data or images to use as training data, all the way through to possessing a model. With these apps easily downloadable on app stores, there is a lack of friction in the process. This means that we have seen horrific cases of children using these apps in schools across the world with devastating consequences. In summer, I met the father of a little girl who had been bullied in this way and sadly took her own life.
I am very grateful to the noble Baroness, Lady Kidron, for this thoughtful and comprehensive amendment, which seeks to future-proof with its inclusion of avatars. We have already seen these threats evolving in the metaverse. I encourage the Government to adopt this amendment so that we can begin to see an end to this abusive market.
I turn to my Amendment 211G. I am very grateful to the noble Lords, Lord Clement-Jones and Lord Browne of Ladyton, and the noble Baroness, Lady Kidron, for putting their names to it. Noble Lords may recognise it from my Private Member’s Bill on non-consensual sexually explicit images and videos. I will keep my remarks brief as many of your Lordships were present on Friday.
The amendment seeks to create offences for the non-consensual creation of sexually explicit content and to close the gaps in the Sexual Offences Act. It is, vitally, consent-based, meaning that victims do not have to suffer the trauma of proving the motivation of their perpetrator. It includes solicitation to prevent any creation laws being circumvented by asking those in other jurisdictions to create such content for you through the uploading of clothed images to forums. Finally, it includes forced deletion so that victims can clearly see their rights to have the content destroyed from any devices or cloud-based programmes and do not have to live in fear that their perpetrator is still in possession of their content.
This amendment is inspired by the lived experience of victim survivors. The Government have repeatedly said that they are looking for the most suitable legislative vehicle to fulfil their commitment to criminalise the creation of sexually explicit deepfakes. It seems they did not think my Private Member’s Bill was the right vehicle, but it is my firm belief that the most appropriate legislative vehicle is the one that gets there quickest. I am hopeful that the Government will be more receptive to an amendment to their legislation, given the need urgently to tackle this rapidly proliferating form of abuse.
Amendment 211H addresses the problem of sexually explicit audio, which the noble Baroness, Lady Gohir, spoke about so movingly in Friday’s debate. We have seen satirical voice cloning, such as of Gareth Southgate at the 2024 Euros. However, the most state-of-the-art systems now require around three seconds of voice audio data to create speech on a par with a human. This could be data from a short phone call or a TikTok video. As we are reaching the point where less data is required to create high-quality audio, this now has the potential to be weaponised. There is a real risk that, if we do not future-proof against this while we have the opportunity, it could rapidly develop in the way that sexually explicit deepfake images have. We are already seeing signs of new sexually explicit audio online. Its ease of use combined with its accessibility could create a huge risk in future.
Henry Ajder, the researcher who pioneered the study of non-consensual deepfake image abuse, said:
“2024 has seen AI generated voice audio widely used in spreading political disinformation and new forms of fraud, but much less attention has been paid to its potential as a tool for digital sexual abuse”.
In his research in 2018, he observed several cases of online communities experimenting with voice-cloning capabilities, targeting celebrities to create non-consensual “synthetic phone sex” content. This Bill could be a key opportunity to future-proof against this problem before it becomes widespread.
My Lords, I declare my interests as set out in the register, particularly as CEO of Muslim Women’s Network UK, which operates a national helpline. I also apologise for not being here at Second Reading, but I felt compelled to speak today after the noble Baroness, Lady Owen, put forward her amendments. Before I speak to them, I support all the amendments from the noble Baroness, Lady Kidron—everything she says is always very powerful.
The noble Baroness, Lady Owen, made her case powerfully today, as she did last week. I too spoke in that debate. We were disappointed across the House that the Government were not very supportive of the Bill, but they hinted that its amendments and recommendations could be integrated into another Bill. This Bill could be it.
I will focus my comments on audio recordings, which I raised last week. This element gets overlooked, because we tend to focus on sexually explicit images and video recordings. However, perpetrators will also record audio of sexual activities without consent and either share or threaten to share it. As the noble Baroness, Lady Owen, mentioned, people can create deepfakes very easily with new technologies. A person’s voice is recognisable to the people who know them, so this must be addressed and it can be in this Bill.
Perpetrators of intimate image and intimate audio abuse can instil fear, humiliate and make victims feel unsafe without even sharing, or threatening to share, the material. They can manipulate and control their victims simply by making them aware that they have recorded or created these images and recordings.
The Muslim Women’s Network’s helpline has had women call to say that, when relationships have broken down, husbands and boyfriends have made secret audio recordings and then threatened them with those recordings. Sometimes, they have shared them online or with family members and friends. Just knowing that they possess these recordings makes these women feel very unsafe and live in fear. In some communities and cultures where people will be worried about honour-based abuse, women will be even more fearful of the repercussions of these audio recordings being shared.
Whether it is original audio or digitally created deepfake audio, the law needs to be amended to prevent this type of abuse. If the Labour Party and the Government are serious about halving abuse against women and girls, they must shut down every avenue of abuse and accept these amendments.
(7 months, 1 week ago)
Grand Committee

My Lords, I thank the Minister for her introduction. I endorse everything she said about intimate image abuse and the importance of legislation to make sure that the perpetrators are penalised and that social media outlets have additional duties under Schedule 7 for priority offences. I am absolutely on the same page as the Minister on this, and I very much welcome what she said. It is interesting that we are dealing with another 2003 Act that, again, is showing itself fit for purpose and able to be amended; perhaps there is some cause to take comfort from our legislative process.
I was interested to hear what the Minister said about the coverage of the offences introduced by the Online Safety Act. She considered that the sharing of sexually explicit material included deepfakes. There was a promise—the noble Viscount will remember it—that the Criminal Justice Bill, which was not passed in the end, would cover that element. It included intent, like the current offence—the one that has been incorporated into Schedule 7. The Private Member’s Bill of the noble Baroness, Lady Owen—I have it in my hand—explicitly introduces an offence that does not require intent, and I very much support that.
I do not believe that this is the last word to be said on the kinds of IIA offence that need to be incorporated as priority offences under Schedule 7. I would very much like to hear what the noble Baroness has to say about why we require intent when, quite frankly, the creation of these deepfakes requires activity that is clearly harmful. We clearly should make sure that the perpetrators are caught. Given the history of this, I am slightly surprised that the Government’s current interpretation of the new offence in the Online Safety Act includes deepfakes. It is gratifying, but the Government nevertheless need to go further.
My Lords, I welcome the Minister’s remarks and the Government’s step to introduce this SI. I have concerns that it misses the wider problems. The powers given to Ofcom in the Online Safety Act require a lengthy process to implement and are not able to respond quickly. They also do not provide individuals with any redress. Therefore, this SI adding to the list of priority offences, while necessary, does not give victims the recourse they need.
My concern is that Ofcom is approaching this digital problem in an analogue way. It has the power to fine and even disrupt business but, in a digital space—where, when one website is blocked, another can open immediately—Ofcom would, in this scenario, have to restart its process all over again. These powers are not nimble or rapid enough, and they do not reflect the nature of the online space. They leave victims open and exposed to continuing distress. I would be grateful if the Government offered some assurances in this area.
The changes miss the wider problem of non-compliance by host websites outside the UK. As I have previously discussed in your Lordships’ House, the Revenge Porn Helpline has a removal rate of 90% of reported non-consensual sexually explicit content, both real and deepfake. However, in 10% of cases, the host website will not comply with the removal of the content. These sites are often hosted in countries such as Russia or those in Latin America. In cases of non-compliance by host websites, the victims continue to suffer, even where there has been a successful conviction.
If we take the example of a man who was convicted in the UK of blackmailing 200 women, the Revenge Porn Helpline successfully removed 161,000 images but 4,000 still remain online three years later, with platforms continuing to ignore the take-down requests. I would be grateful if the Government could outline how they are seeking to tackle the removal of this content, featuring British citizens, hosted in jurisdictions where host sites are not complying with removal.
(7 months, 3 weeks ago)
Lords Chamber

I thank my noble friend for her question. We are absolutely determined to keep children safe online and to use the Online Safety Act to provide protection across all the categories under its jurisdiction. Ofcom’s draft guidance lays out which technologies could constitute, for example, highly effective age assurance to protect children, and it will have a full range of enforcement powers to take action against companies that do not follow the duties, including substantial fines. I absolutely agree with my noble friend that robustness is key here. I think some people are frustrated that some of the duties in the Online Safety Act are taking time to be rolled out, but it was a feature of the Act that it would be done on that basis. We are very keen, as everybody in the House is, to see it enacted in full as soon as it can be.
My Lords, the Revenge Porn Helpline has a removal rate of 90% of non-consensually shared intimate content, including deepfakes. However, in 10% of cases, the host site will not comply with its removal, even where there has been a successful conviction. These sites are often hosted in Russia and Latin America, and are unlikely to come under Ofcom’s scope, even with the changes that make sharing a priority offence. Can the Minister inform the House what action the Government are taking to address non-compliance, and does she agree that it would be better to adopt a rapid and wide-ranging approach—favoured by victims—to deem NCII content illegal, thus giving internet service providers the power to block it?
I thank the noble Baroness for her continuing interest in this issue and her campaigning work. The Government have already put forward secondary legislation to ensure that the new intimate image abuse offence is made a priority under the Online Safety Act, and all other acts of deepfake portrayal will come under the Act if they are illegal. Going back to the earlier question about robustness, we absolutely expect Ofcom to implement those protections in a robust way.
(8 months, 3 weeks ago)
Lords Chamber

My noble friend makes the important point that international co-operation is absolutely vital. We continue to talk to all our friends across the globe, exchanging information and making sure that best practice arises from those discussions.
My Lords, research by Vodafone found that algorithms are pushing content to boys related to misogyny and violence following innocent and unrelated searches. Can the Minister say whether the Government are looking into how these algorithms have been used not only to push misinformation and disinformation but to push people towards and reinforce more extreme views?
My Lords, deepfakes and other forms of manipulated media are captured by the Online Safety Act where they constitute illegal content or harmful content to children in scope of the regulatory framework. Under the Act, all companies will be forced to take action against illegal content online, including illegal misinformation and disinformation, and they will be required to remove in-scope content. These duties will also apply to in-scope AI-generated content and AI-powered features.