(6 days, 4 hours ago)
Lords Chamber

My Lords, I will speak to the government amendments and to the amendments in my name and in the names of the noble Baroness, Lady Kidron, and the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara. In doing so, I declare an interest as receiving pro bono legal advice from Mishcon de Reya on image-based sexual abuse.
I am grateful to the Government for working with me to bring forward their amendment in response to my amendment in Committee on 48-hour take-down. I am pleased they are working with me on the amendments that your Lordships’ House passed on Report on the creation of a centralised hash registry and hash sharing. I must add that it is disappointing that, after months of speaking to the Government about the importance of the hashing and 48-hour amendments working together, they cannot be scrutinised together.
While I am very pleased that government Amendment 1 addresses the concerns I brought forward on de-indexing and duplicates, I do not believe it is sufficient to achieve the mechanism I set out to create in my original 48-hour take-down amendment in Committee. My intention was to create a system where no victim is left behind. This requires the mechanism to be agile and for internet services to feel the consequence of not acting on each individual instance reported. The government amendment has done the bare minimum and simply updated the Online Safety Act where it already instructed internet services to swiftly take down such content, to now add,
“as soon as reasonably practicable, and no later than 48 hours”.
In reality, this represents very little change as the good actors will still move at pace and the bad actors will continue to ignore. One survivor, Jodie, who many noble Lords have met, responded to the government amendment by saying that
“it is hugely frustrating to see headline grabbing commitments without the substance needed to actually protect victims. A 48-hour deadline sounds strong, especially when delivered by the Prime Minister to millions on breakfast television, but without real enforcement it risks creating false hope”.
Another victim, Daria, said:
“As a survivor, I feel this is quite simply gaslighting”.
We must remember that Ofcom rules are about systems and processes, and not outcomes. If a service has followed the rules but individual violations still occur, an internet service will not be held responsible. Sophie Mortimer at the Revenge Porn Helpline confirmed this, stating:
“While the platforms that already act in good faith will meet these standards, the persistent bad actors who continue to drive the sharing of this content will ignore and the Government amendment does not give Ofcom enough weapons to respond”.
I am deeply concerned that the Government have not specified how Ofcom will even know if a service fails to act within 48 hours. Ofcom has confirmed that there is no automatic mechanism for it to know whether services are not meeting the 48-hour take-down requirement in any given case. Further, the only recourse the Government provide should a service be found to generally not comply are the long and bureaucratic business disruption measures. This means that women will still suffer ongoing trauma when platforms refuse to comply.
My amendments seek to address the gaps in the government amendments, and I will outline them briefly. Amendments 2 and 8 mandate services to publicly report—and report to Ofcom—their average take-down times.
Amendments 3 and 9 strengthen the government wording on finding duplicate images to ensure that services have to take all reasonable steps, instead of simply relying on what a service may identify.
Amendments 4 and 10 incentivise services to act by creating a more agile mechanism whereby they can be fined per violation, and this can increase for every 24-hour period in which they fail to act, thus ensuring there is a consequence for not acting on individual instances of abuse. I believe these amendments create a more agile mechanism and do not rely solely on business disruption measures. This amendment is based on the TAKE IT DOWN Act, which operates under the rules of the Federal Trade Commission in the USA. The sum I have chosen is based on the figure levied under FTC rules for continued instances of violation after companies have been notified.
Amendments 5 and 11 mandate the Secretary of State to create a mechanism whereby individuals can report to Ofcom in cases where the service provider has failed to remove the content within 48 hours. At present, it is not clear what a victim would do if they reported the content to a service which then failed to act after the initial 48 hours.
Amendments 6 and 12 ensure that services have “clear and conspicuous” notices of where victims can report NCII content. This uses the wording from the TAKE IT DOWN Act and gives more clarity to internet services. The government amendment and the Online Safety Act refer simply to being able
“to easily make an intimate image content report to the provider”.
Amendments 7 and 13 add provisions that seek to curb malicious reporting by requiring a statement that the report has been made “in good faith”. Additionally, this provides internet services with further assurances they need to act more quickly upon receiving reports.
I am grateful to the Government for coming to the table on this issue. However, victims deserve so much more than press releases that promise action but in reality represent little practical change in the most traumatic moment of their lives. I implore noble Lords to vote with me so that no victim is left behind. I beg to move.
My Lords, at Third Reading it is extraordinarily rare to find issues still in contest, and to be presented, as we have been today, with a choice on which we will have to vote. Normally, by this stage, the issues have been clearly discussed and the parties concerned—the Government on the one side and those proposing amendments on the other—have had enough meetings to be able to get to a point where they can agree on what is going forward.
Having said that, I am sure that the whole House is very grateful to my noble friend the Minister for bringing forward what he has brought forward. These are substantial changes to the Online Safety Act and they are extraordinarily welcome. They cover the ground very well, but, as has been pointed out, they perhaps do not go quite as far as they could do. We are at Third Reading, so it is therefore very difficult to find the time and space to be able to resolve what I think are relatively quite small differences between the two sides.
I point out simply to my noble friend the Minister that this places those of us who support the noble Baroness in her amendments in a difficult position about his amendments, which we want to support; but the only way to get them to resolution is probably to vote with the noble Baroness. I hope he will appreciate that, and I suggest to him that, when he comes to respond, he makes it very clear that the Government are still willing to talk about these issues and still willing to meet those who have concerns and views about what the Government have done. I hope he might be able to promise that action could be taken in the Commons to resolve this.
I am grateful to the noble Baroness, Lady Owen, for tabling her amendments and initiating this discussion. I feel like someone who has brought a birthday cake to a party, only to have someone else blow the candles out. On behalf of the Prime Minister, the Department for Science, Innovation and Technology, the Ministry of Justice and the Home Office, I have tried my best to bring forward proposals that meet the objectives the Government themselves have set, as well as those of the noble Baroness.
Taken together, Amendments 2 to 13 would amend government Amendment 1 by introducing fixed penalties, public performance reporting and new escalation routes to Ofcom. I note the support for these amendments from the noble Lord, Lord Clement-Jones, from the Liberal Democrat Benches; the noble Lord, Lord Davies of Gower; my noble friend Lord Stevenson of Balmacara; the noble Baroness, Lady Kidron; the noble Lord, Lord Pannick; and the noble Lord, Lord Russell of Liverpool. I also note the short, sharp intervention from the noble Baroness, Lady Jones of Moulsecoomb, which I very much welcomed.
On the proposal to require services to publish average take-down times, I say to the noble Baroness and others that I recognise the desire for both transparency and public accountability. Ofcom already has the power to request information of this nature, which would also apply to the Government’s amendment. However, publicly benchmarking speed in this way risks hardwiring the wrong incentive into the system. This duty is not intended to be a race to remove any reported content at all costs, including where reports are mistaken, malicious or vexatious. Parliament is asking providers to act quickly and responsibly, which necessarily includes occasionally verifying that reports are valid.
A single, public average-time metric could encourage the unintended removal of lawful content, undermine procedural safeguards and, critically, ultimately undermine confidence in the regime among the very victims whom this Government, together with the noble Baroness, wish to support. Ofcom has strong powers to require detailed performance data where there are concerns about systemic compliance. Regulator-led scrutiny is a more effective, credible and proportionate means of accountability that ensures a regime that best delivers for its victims.
Amendments 3 and 9 would require providers to take all reasonable steps to identify duplicates or substantially similar content. I share that objective on behalf of the Government. Providers are already required to take proportionate steps to seek out this illegal content under wider illegal content duties.
On the proposal of specific fines, the noble Lord, Lord Pannick, and the noble and learned Lord, Lord Thomas, mentioned that it is important that there are financial consequences for any illegal action. I say to them and to the noble Baroness that, as they know, the Online Safety Act already equips Ofcom with very strong enforcement powers. Ofcom can already issue a heavy fine of up to 10% of qualifying worldwide revenue in the event of contravention of regulations that Ofcom is empowered to monitor, and these fines can even be augmented with daily fines on a case-by-case basis. Therefore, it is not necessary to introduce an additional fixed-rate fine mechanism on the face of the Bill, given that a 10% fine on qualifying worldwide revenue is a significant and effective potential punishment from Ofcom, which has those enforcement powers.
Can the Minister say what an individual woman should do if her content is not removed within 48 hours? Is the Minister suggesting that, without a mechanism to contact Ofcom, she waits for Ofcom to recognise that a website has failed in its duty, and therefore for the Secretary of State to mandate long and bureaucratic business disruption measures, and for Ofcom to seek 10% of the business’s worldwide revenue—and all the while her intimate image is left online?
The purpose of the regulation is to provide a disincentive to putting content up in the first place. If anybody who places that content on any online platform knows that Ofcom has the power to levy a 10% fine on worldwide revenue, there will be that disincentive. The purpose of that power is to deter people from breaking the law. Coupled with the powers in government Amendment 1, it will provide strong reassurance to anybody who has had illegal content put online by any particular organisation or individual.
There may be an honest disagreement between the noble Baroness and me on this, but I want to prevent any illegal content being put up in the first place. I would argue that a 10% fine of any worldwide revenue for the platform that hosts that content is a significant contribution. It would mean, ultimately, that Ofcom has the power to cause significant damage to any organisation that puts up that illegal content. I accept and understand the concerns that have been raised; I just hope that the noble Baroness can also understand that the Government are trying to support the very victims she speaks about.
We appreciate the intention behind enabling individuals also to report non-compliance. They can raise that concern through Ofcom’s reporting portal, and such reports can signal potential systemic issues and can be used for wider investigations, as I have just mentioned. I also recognise the urgency with which victims rightly expect this content to be removed—the very point the noble Baroness has just made. I consider that a systems and processes approach remains the most effective way to secure consistent compliance and deliver protection at scale.
On the amendment the noble Baroness has brought forward that would require providers to display reporting notices and routes, the 2023 Act already requires platforms to have clear, accessible reporting routes that allow users to easily make intimate image reports. Again, Ofcom is best placed to specify details about this in its code of practice. Turning to proposals for good faith declarations, the government amendment requires reporting individuals to state that the content is intimate image content and that they are the subject of that content or are acting on the subject’s behalf. Additionally, the Secretary of State has regulation-making powers to specify further requirements if needed. I hope that that satisfies the noble Baroness. I hope the House can recognise that the Government have moved significantly on this issue, but we will hear the noble Baroness’s response in due course.
Amendments 15 to 17, proposed by the noble Baroness, Lady Bertin, are accepted by the Government. They were, as she has said, tidying-up amendments agreed by the House on Report but sadly missed. As such, the Government will not oppose the amendments and will actively support them. This is, however, without prejudice to any further consideration of the substantive amendments carried on Report. We will set out the Government’s position on these and other amendments passed on Report when the Bill returns to your Lordships’ House after the Easter Recess, once it has been considered by the House of Commons.
I have tried to be constructive in my response on behalf of the whole of the Government—from the Prime Minister to the different departments that have contributed to this. I hope they were helpful engagements. I thank the noble Baroness, Lady Owen, for her amendments, and I hope that, having heard what has been said—it is, perhaps, with little hope—she will withdraw her amendment.
My Lords, I thank the Minister for his response. I feel that, on this point, we have not reached an agreement. While 10% of an internet service’s worldwide revenue is great, a more agile system where no woman and no victim is left behind is much better. With that, I wish to test the opinion of the House.
(1 week, 6 days ago)
Lords Chamber

My Lords, I will speak to Amendments 422D and 433 to 437. I fully support the noble Baroness, Lady Kidron. Her arguments have been entirely backed up by the release only today of the report entitled Invisible No More: How AI Chatbots Are Reshaping Violence Against Women and Girls by Durham University and Swansea University. The research identifies the range of design choices and failures in safety mechanisms that enable, encourage, simulate and normalise violence against women and girls. The report found that fantasies of incest and rape were normalised, and one chatbot, Chub AI, suggested violent rape and domestic abuse as categories.
I reiterate the concerns of the noble Baroness, Lady Kidron, about the long and bureaucratic path to business disruption measures, meaning that harm continues to perpetuate as our system is not agile enough to tackle these rapidly evolving issues. I wish to pay tribute to Professor Clare McGlynn KC for her work co-authoring this ground-breaking report and emphasise the warning she made in today’s Times newspaper. She said:
“Chatbot violence against women represents a rapidly escalating threat. Without early intervention, these harms risk becoming entrenched and scaling quickly, mirroring what happened with deepfake and nudify apps, where early warnings were largely ignored. We must not make the same mistakes again”.
Professor McGlynn and the noble Baroness, Lady Kidron, once again demonstrate their ability to warn against these emerging harms, and I sincerely hope that noble Lords will back the noble Baroness should she wish to divide the House today.
My Lords, I support Amendment 422D and the consequential Amendments 434 to 437, to which I have added my name. In Amendment 429B the Government have gone a long way towards responding to concerns over AI-generated harms, but this amendment, as the noble Baroness, Lady Kidron, has said, gives enormous powers to the Secretary of State to decide the shape of how AI-generated services are controlled in this country. The Minister knows there is concern across the House about exposing this central part of the new tech economy to what are effectively unfettered ministerial powers. Very few noble Lords want to support a skeleton amendment like this.
Government Amendment 429B gives the Secretary of State the right to amend, which is defined later as including the right to
“repeal and apply (with or without modifications)”.
This applies to all the Part 3 illegal content duties of the Online Safety Act in relation to AI services. Parliament will not even have an option to amend regulations on this issue. Proposed new subsection (1) in this amendment seems like a big deal to me, and the noble Lord should be very concerned. The intention seems to be that the basis of the existing regime in Part 3 will be used, but we do not know how the Secretary of State will decide to adapt that regime to fit the particularities of AI services that generate illegal content. As the noble Baroness, Lady Kidron, pointed out, that goes a long way beyond AI services designed to mimic humans and human conversations, which is what chatbots are. If a subsequently elected Government are in thrall to the tech companies, how might they abuse this power?
During the passage of the Online Safety Act, noble Lords spent time and energy defining both a “search service” and a “user-to-user service”, and their responsibility for both designing out and mitigating illegal harms. It seems extraordinary not to have the details of the new services on the face of the legislation. The definition of “AI” in new subsection (17) is oddly uninformative. It simply says:
“‘AI’ is short for artificial intelligence”.
I think we all know that. That does not give us much of a clue about which technology it covers. By contrast, I draw your Lordships’ attention to Article 3(1) of the EU’s Artificial Intelligence Act, which sets out a carefully thought through definition of an AI system:
“‘AI system’ means a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that … infers, from the input it receives, how to generate outputs such as predictions … or decisions that can influence physical or virtual environments”.
The unclear nature of the AI definition in the amendment is compounded by new subsection (10), which allows for the definition of the provision to be changed and expanded. Once again, Parliament will not be able to amend any regulations derived from this power.
The biggest concern about the amendment is that, although it covers illegal content, it does not cover content that is harmful to children. As a result, I completely support my noble friend Lady Kidron’s Amendment 422D, and its consequential amendments, which would assuage many of my concerns about the scope and power given to Ministers at the expense of Parliament. I also urge noble Lords to vote against government Amendment 429B when it comes up later in the evening.
I also say to the Minister that regulating the wide definition of “AI” covered in Amendment 429B is important. It needs to be brought back as part of wider artificial intelligence legislation. I hope that he can reassure noble Lords that we will hear more about this in the King’s Speech.
(4 months ago)
Lords Chamber

My Lords, I support all the amendments in this group, and in particular I pay tribute to the noble Baroness, Lady Kidron, for her endless work in this capacity. This is the first time I have spoken on any of these groups of amendments. I find everything the noble Lord, Lord Nash, the noble Baroness, Lady Kidron, and others have said truly shocking. Some 55 years ago, I started a magazine called Spare Rib. If I had ever dreamed, in my wildest and worst nightmares, that I would find myself listening to what everyone has been talking about, I suppose we would not have gone on. In so many ways, this is a worse situation that women find themselves in, and certainly young girls. I carried on riding a pony till I was 15—that was my childhood—and then I found boys. This is so terrible, and I congratulate every noble Lord, and particularly the noble Baronesses, on the work that they have done.
I will be very brief, as I just want to speak in support of the amendment from the noble Lord, Lord Nash, and Amendment 266, which simply says that AI is already being used to harm children. Unless we act decisively, this harm will just escalate. The systems that everyone has been discussing today are extraordinary technological achievements—and they are very dangerous. The Internet Watch Foundation has reported an explosion in AI-generated child sexual abuse material. Offenders can now share instructions on how to manipulate the models, how to train them on illegal material and how to evade all the filters. The tools are becoming so accessible and so frictionless that a determined offender can produce in minutes material that once would have involved an entire criminal enterprise. Against that backdrop, it is quite staggering that we do not already require AI providers to assess whether their systems can be used to generate illegal child sexual abuse material. Amendment 266 would plug this gap. Quite frankly, I cannot for the life of me see why any responsible company would resist such a requirement.
Amendment 479 addresses a confusion that has gone on for too long. We cannot have a situation where some companies argue that generative AI is a search service and therefore completely in scope of the Online Safety Act, while others argue the opposite. If a model can retrieve, repackage or generate harmful content in response to a query, the public deserve clarity about precisely where that law applies.
On Amendment 480, this really is an issue that keeps me awake at night. These chatbots can be astonishingly persuasive. As the noble Baroness, Lady Kidron, says, they are also addictive: they are friendly, soothing and intimate, and are a perfect confidant for a lonely child. They also generate illegal material, encourage harmful behaviour and groom children. We have already seen chatbots modelled on sex offenders and heard reports of chatbots sending sexualised messages to children, including the appalling case of a young boy who took his life after weeks of interaction with AI. We will no doubt hear of more such cases. The idea that such systems might fall through the cracks is unthinkable.
What these amendments do is simple. They say that if a system can generate illegal or harmful content for a child, it should not be allowed to do so. Quite frankly, anything that man or woman can make, man or woman can unmake—that is still just true. We have often said in this Chamber that children deserve no less protection online than they do offline. With AI, however, we should demand more, because these systems are capable of things no human predator could ever manage. They work 24/7, they target thousands simultaneously and they adapt perfectly to the vulnerabilities of every child they encounter. The noble Baroness, Lady Kidron, is right to insist that we act now, not in two years—think how different it was two years ago. We have to act now. I say to the Government that this is a real chance to close some urgent gaps, and I very much hope that they will take it.
My Lords, I support all the amendments in this group, but I will speak to Amendments 479 and 480 in the name of the noble Baroness, Lady Kidron. I declare my interest as a guest of Google at their Future Forum, an AI policy conference.
These amendments are vital to ascertain the Government’s position on AI chatbots and where they stand in relation to the Online Safety Act, but I have to question how we can have been in a state of ambiguity for so long. We are very close to ChatGPT rolling out erotica on its platform for verified adults. Six months ago, the Wall Street Journal highlighted the deeply disturbing issue of digital companion bots engaging in sexual chat with users while telling them that they were underage. Further, they willingly played out scenarios such as “submissive schoolgirl”. Another bot purporting to be a 12-year-old boy promised that it would not tell its parents about dating a user identifying himself as an adult man. Professor Clare McGlynn KC has already raised concerns about what she has coined chatbot-driven VAWG, the tech itself being designed to be sexually suggestive and to engage in grooming and coercive behaviours. Internet Matters found that 64% of children use chatbots. The number of companion apps has rapidly developed and researchers at Bournemouth University are already warning about the addictive potential of these services.
The Government and the regulator cannot afford to be slow in clarifying the position of these services. It begs a wider question of how we can be much more agile in our response and continually horizon-scan, as legislation will always struggle to keep pace with the evolution of technology. This is the harm we are talking about now, but how will it evolve tomorrow? Where will we be next month or next year? It is vital that both the Government and the regulator become more agile and respond at pace. I look forward to the Minister’s response to the noble Baroness’s amendments.
My Lords, I shall speak very briefly. Earlier—I suppose it was this morning—we talked about child criminal exploitation at some length, thanks particularly to the work of the noble Baroness, Lady Casey, and Professor Jay. Essentially, what we are talking about in this group of amendments is child commercial exploitation. All these engines, all these technologies, are there for a commercial purpose. They have investors who are expecting a return and, to maximise the return, these technologies are designed to drive traffic, to drive addiction, and they do it very successfully. We are way behind the curve—we really are.
I echo what the noble Baroness, Lady Morgan, said about the body of knowledge within Parliament, in both Houses, that was very involved in the passage of the Online Safety Act. There is a very high level of concern, in both Houses, that we were perhaps too ambitious in assuming that a regulator that had not previously had any responsibilities in this area would be able to live up to the expectations held, and indeed some of the promises made, by the Government during the passage of that Act. I think we need to face up to that: we need to accept that we have not got it off to as good a start as we wanted and hoped, and that what is happening now is that the technologies we have been hearing about are racing ahead so quickly that we are finding it hard to catch up. Indeed, looking at the body language of your Lordships in the Chamber and the expressions on our faces as some of this is described: if it is having that effect on us, imagine what effect it is having on the children who in many cases are the subjects of these technologies.
I plead with the Minister to work very closely with his new ministerial colleague, the noble Baroness, Lady Lloyd, and DSIT. We really need to get our act together and focus; otherwise, we will have repeats of these sorts of discussions where we raise issues that are happening at an increasing pace, not just here but all around the world. I fear that we are going to be holding our hands up, saying “We’re doing our best and we’re trying to catch up”, but that is not good enough. It is not good enough for my granddaughter and not good enough for the extended families of everybody here in this Chamber. We really have to get our act together and work together to try to catch up.
(5 months, 2 weeks ago)
Lords Chamber

My Lords, I welcome the Minister to her new role, and I very much look forward to working with her. I further welcome the clarification that this Bill brings to the law on spiking and the new offence of taking non-consensual intimate images. I very much look forward to supporting my noble friend Lady Sugg on her amendments on honour-based abuse and my noble friend Lady Bertin on her amendments on online pornography. I want to take this opportunity to congratulate my noble friend Lady Bertin on her brilliant review and thank her for her tireless efforts pushing for comprehensive law on online pornography.
I turn now to the new taking offence. I greatly welcome the implementation of the Law Commission recommendation to update the pre-existing voyeurism and upskirting offences and implement a single taking offence. I am very pleased to see that it is, crucially, a consent-based offence, removing the unnecessary burden of having to prove the motivation of the perpetrator, which has featured in previous iterations of image-based abuse offences. However, it is vital that we further strengthen this offence by increasing the time limits prosecutors have to bring forward charges, so that victims are not inadvertently timed out by the six-month time limit of a summary offence.
In February, the Government gave me an undertaking to extend the time limits for the non-consensual creation offence in the data Bill after it was highlighted by the campaign group #NotYourPorn. The extension of the time limit here means that, for the creation offence, victims have three years from when the offence is committed or alternatively from when the CPS has enough evidence to prosecute. Given that we have already achieved a legal precedent for extending the time limits on image-based sexual abuse, I would be grateful if the Minister, in his summing up, could commit to extending the time limits available in both the new taking offence and the pre-existing sharing offence, to ensure that all image-based abuse offences have parity within the law.
I was pleased to see the updating of the Sentencing Code to reflect the new taking offence and to clarify that the photograph or film to which the offence relates, and anything containing it, is to be regarded as used for the purpose of committing the offence. However, I am keen that we look into further ways to ensure that this content is not kept by perpetrators and remains offline in perpetuity. Further, I will continue my work with survivors of this abuse and charities to explore ways in which this content can be removed from the internet as rapidly as possible.
Additionally, I was concerned that there does not seem to be a sufficient definition of what it is to “take” an image or video in the offence, and I would therefore also be grateful if the Minister could confirm that the definition of taking will include screenshotting. In the 2022 Law Commission report on intimate-image abuse, the example was given where a person may consent to being in an intimate state on a video call but not consent to the person screenshotting them. The Law Commission concluded that taking a screenshot of a video call should fall under the definition of taking, because this conduct creates a still image that does not otherwise exist.
I turn now to the issue of spiking, which my colleague in the other place, Joe Robertson MP, has highlighted, alongside the campaigners Colin and Mandy Mackie, whose son Greg tragically died after a spiking incident at university. While the clarification of spiking in this new offence is very welcome, I echo the point made by my noble friend Lady Coffey that there is concern that the intention element might be too narrow and that it might not allow for cases where a person has been spiked that do not fall into the categories of injure, aggrieve or annoy. This Bill is a positive step, and I look forward to working with the Government and noble Lords to strengthen it.
(1 year, 8 months ago)
Lords ChamberMy Lords, it is a great pleasure to speak in the debate on the humble Address and I welcome the new Ministers to their place. I was pleased to see this Government’s commitment to halving violence against women and girls. However, I am keen to understand whether the renewed focus on VAWG will include tech-facilitated abuse, as I was disappointed that no reference was made to the growing crisis of image-based abuse. Over the last few years, we have seen a piecemeal approach to legislating on this issue, with up-skirting, cyber flashing and the sharing of intimate images now illegal but the non-consensual taking of sexually explicit images, as well as the solicitation to create and the creation itself of sexually explicit deepfakes, remaining gaping omissions in our patchwork of law in this area.
I was pleased to see the Labour Party manifesto make a commitment to legislate on the creation of non-consensual sexually explicit deepfakes. Ninety-nine per cent of all sexually explicit deepfake videos feature women. If the Government are to succeed in their plan to tackle VAWG, they must not treat online violence in isolation. It can often form part of a much wider picture of abuse. Every day that we delay introducing this legislation is another day when women have to live under the ever-present threat that someone will steal their picture to create sexually explicit images or pornographic videos of them. Every woman should have the right to choose who owns a naked image of her.
I have been privileged in my work in this area to meet Mariano Janin, who understands that sexually explicit deepfake videos were used to bully his beautiful 14 year-old daughter, Mia, leading to her tragically taking her own life. Sadly, this is not an isolated incident. I have also been entrusted by “Jodie”, whose case may be familiar, as she was brave enough to speak to the BBC about the trauma of being deepfaked by someone she counted as her best friend. Jodie discovered that pictures had been taken off her private Instagram page, overlaid on to pornographic images and posted on Reddit and other online forums, with comments asking people to rate her body. Jodie endured this abuse for five years, finding hundreds of pictures of herself, her friends and many other young women.
While it is illegal in the UK to share sexually explicit deepfaked images, it is still not illegal to create them. In Jodie’s case, the perpetrator was soliciting the creation of images from others. It is of the utmost importance that solicitation becomes an offence in itself, to prevent deepfakes being solicited from jurisdictions which may not yet have legislated. We must not underestimate the real impact this digital content has on those such as Jodie whose image has been stolen. The content is often used to bully, harass and even extort money. It is not a one-off experience. Survivors often have to manage the trauma of this digital content trending, or of being subject to further digital abuse, at any given moment.
We must become more agile in our response by ensuring that we view tech-facilitated abuse as a cohesive whole; we must work to find the balance between Parliament having legislative oversight and a regulator having the power to act quickly, not only to remove harms but to anticipate and future-proof against them.
I am determined that we should close the gaps on the taking of non-consensual intimate images, as well as the creation of, and solicitation to create, non-consensual sexually explicit deepfakes. My Private Member’s Bill, being introduced on 6 September, seeks to address this. Urgent legislation is required as part of this new Government’s VAWG strategy to ensure the safety of women and girls online. It is not enough to react to this abuse; we must prevent it happening in the first place.