Digital Image Abuse Debate

Thursday 2nd December 2021


Commons Chamber
Mrs Maria Miller (Basingstoke) (Con)

It is a great pleasure to speak in this Adjournment debate. There can be few things more harmful, traumatising or abusive than for an individual to have a nude or sexually explicit image shared without their consent with thousands or even millions of people online. It is a horrific invention of the online world and an act of sexual abuse because it is done without the consent of the victim.

Technology is being used every day to invent new and even more grotesque ways of inflicting abuse, particularly sexual violence, especially against women and girls. I have secured this debate on deepfake and nudification image abuse because they are yet more forms of abuse against women and girls, their impact is not understood, and they continue to be largely unrecognised, especially in law and the legal sanctions that are available. It is a great pleasure to see the Parliamentary Under-Secretary of State for Justice on the Front Bench, underlining the Government’s understanding of the need to address this issue.

For those who are unfamiliar with the term “deepfakes”, they are pornographic images that are created by merging existing pornographic content with the image of an individual—usually a woman—who has not given her consent. The resulting pornographic material is often violent, including illegal depictions of rape. In a similar way, nudification software takes everyday images—again, usually of women without their consent—and uses an extensive database of pornographic images to create a new image that makes it appear as though the original subject of the photo is nude.

The decision to create and share a deepfake or a nudified image is a highly sinister, predatory and sexualised act undertaken without the consent of the person involved. It has been a growing problem for the past 10 years, along with other forms of intimate image abuse. Reports of such abuse have grown by almost 90% in the past 12 months, coinciding—not coincidentally—with the lockdown, the pandemic and the changes in behaviour that are leaving many people at home for longer periods.

All forms of intimate image abuse have a significant and long-term impact on their victims, but I believe that deepfakes and nudification are particularly pernicious because the images are almost completely fabricated, causing psychological harm, anxiety, depression, post-traumatic stress disorder—the list goes on. Some people experience an impact on their physical health or damage to their relationships. There may also be damage to their financial situation because of the need to take time off work or perhaps withdraw altogether from the online world, which we all know is a fundamental part of most people’s jobs in modern society. In some cases, there have been reports of self-harm and even attempted suicide among those who have been a victim of this heinous act.

I would like to turn specifically to the impact on individuals. This horrific abuse can happen to absolutely anyone, as a constituent of the hon. Member for Sheffield Central (Paul Blomfield) discovered in 2019 when she learned that her image had been uploaded to a pornographic website—an ordinary image from her social media that was then manipulated with software to make it appear as though she featured in pornographic material. She was only alerted to the existence of the photos by an acquaintance, after the images had been in circulation for years. The original images were taken from her social media, including photographs from her pregnancy.

I commend the hon. Member’s constituent, because she has had the courage to speak out about something that many cannot or feel unable to speak about. We can understand that much more closely when we hear her words explaining how she felt. She said that the images were “chilling” and that she still experiences nightmares. Speaking of the experience, she said:

“Obviously, the underlying feeling was shock and I initially felt quite ashamed, as if I’d done something wrong. That was quite a difficult thing to overcome. And then for a while I got incredibly anxious about even leaving the house.”

That reaction is typical; it leaves many women frightened to seek the help that they need.

Another victim—I will call her Alana, although that is not her real name—was identified by Professor Clare McGlynn in her work on “Shattering Lives and Myths”, a report on the issue. Alana also had faked intimate images widely circulated without her consent. Her testimony is equally harrowing; I will quote from it, because her words are powerful and the Minister needs to hear them if he is to bring the right solutions to this place. She said:

“It has the power to ruin your life, and is an absolute nightmare, and it is such a level of violation…because you are violated not only by the perpetrator but you also feel violated because society doesn’t recognise your harm. It is a very isolating experience, it is a very degrading and demeaning experience, it is life-ruining.”

Those are words that we should all listen to as we move forward, hopefully, to some solutions.

At the moment, deepfakes are not against the law, and people who use nudification software are not recognised as sexually abusing others. Deepfakes have been a shocking development in violence against women online. Let us be clear: this technology is almost exclusively used to inflict violence against women. Indeed, the cyber research firm Sensity found that 96% of all deepfakes are pornographic and that all the pornographic deepfakes it detected—100%—targeted women.

Offline, non-consensual sexual acts are recognised in the criminal law through the crimes of sexual assault, sexual abuse, rape—the list goes on—yet those responsible for developing and using technology in the online world and through artificial intelligence have been allowed to operate perniciously and with impunity, inflicting online sexual attacks on women and girls without criminal consequences. We cannot allow the online world to become a continuation of the offline world in which women and girls experience yet further new forms of sexual abuse and violence, which is why we need a new law to criminalise the taking, making and sharing of nude and sexual images without consent, including deepfakes and nudification. Those, surely, are some of the worst forms of such activity.

This technology is no longer the reserve of movie CGI experts. Image manipulation can be incredibly technical, but nowadays creating content of this kind is dangerously easy. With the development of nudification apps that can be downloaded on to a phone, anyone can create an indecent image of somebody without their consent in seconds. Apps and websites like these are not hidden in the recesses of the dark net, undiscovered; they are receiving millions of visitors. In the first seven months of 2021, one nudifying app received a staggering 38 million hits. This service has an interesting slogan: it is to

“make all men’s dreams come true”.

I am sure that is not the case, because I know that many of my hon. Friends would find this as abhorrent as I do. The app allows users to undress thousands of women without their consent, and runs an “incentive program” for users who share the links to their deepfakes, so users who get clicks on their deepfakes can nudify more pictures faster. It is disgusting, but it is not against the law. We have to act.

Deepfakes are widely regarded by academics as the future of violence against women online, but the existing law is woefully behind and largely redundant. Section 33 of the Criminal Justice and Courts Act 2015, the so-called revenge porn legislation, in whose drafting I was involved, was a good step in the right direction, but it specifically excludes altered or photoshopped images and videos; and there are shortfalls in the current law because it does not adequately capture all motivations for non-consensually taking or sharing an intimate image. Although motivations such as sexual gratification and causing distress are covered, if the image that is being nudified was not originally private or intimate in nature, and if it was not shared directly with the individual in the photograph, it can be interpreted by law enforcement agencies as not having the intention of harassing or causing distress, even if it is shared with thousands of people on the internet. That is clearly an absurdity that needs to be changed.

Threats to share images have now been included in the Domestic Abuse Act 2021, and the Government are to be applauded for making that change, but if no threat to share is made, there is the potentially ridiculous scenario that the image could be legally shared if the motivation to share it was a joke, because such motives are not recognised in the current law.

I hope I have explained why it is so critical for the Online Safety Bill to effectively mitigate violence against women and girls online by introducing new criminal offences—and I would say that they should be sex offences—of the taking, making and sharing of intimate images without consent. I know that the Online Safety Bill is very popular—we heard about that in the previous debate—but perhaps the Government should be thinking of a set of Bills to be introduced together, rather than trying to put everything into one Bill. There might be a suite of Bills to tackle all the different issues, to prevent the risk of making one Bill so expansive that it becomes what is commonly known as a Christmas tree Bill. It is an innovative approach, which I am surprised that the Government do not take more often when dealing with highly complex areas that are interrelated.

In the tackling violence against women and girls strategy, the Government have committed to root out offending online as well as offline. They cite the forthcoming Online Safety Bill as the instrument in their efforts to do this, but reform of the laws on intimate image abuse is not yet included in the Bill. This oversight needs to be addressed before the Bill comes back to this House for debate, which we hope will be in the very near future. The current law is not fit for purpose. It is a patchwork of different elements based on defined motivations that can make prosecutions more difficult and that fails to recognise the nature and impact of image-based abuse online. If we have an Online Safety Bill that does not tackle the gaps in the criminal law, it will be a Bill that falls well short of what our constituents need.

The Law Commission has already developed a wide range of recommendations for legal reform in this area that are widely supported by industry stakeholders and experts, so I urge the Government to fast-track those recommendations through the Online Safety Bill, in recognition of the fact that we cannot wait any longer for legal reform. We need deepfake and the use of nudification apps to be outlawed in a comprehensive new law to criminalise the making, taking and sharing of intimate sexual images without consent. This change is long overdue, and I know that this Minister understands that point. I look forward to hearing his response to the debate.

--- Later in debate ---
James Cartlidge

As I was saying, a person who shares deepfake images of adults may in some circumstances be committing an existing offence. For example, against a background of domestic abuse, the posting or sharing of faked images could be captured under section 76 of the Serious Crime Act 2015. That offence was created specifically to target controlling or coercive behaviour in an intimate or family relationship, including when the victim is an ex-partner. We are aware that deepfake images are being used for such disturbing and cruel purposes.

In addition, section 1 of the Malicious Communications Act 1988 prohibits the sending of an electronic communication that is indecent, grossly offensive or false, or that the sender believes to be false, if the purpose, or one of the purposes, of the sender is to cause distress or anxiety to the recipient. Furthermore, section 127 of the Communications Act 2003 makes it an offence to send or cause to be sent through

“a public electronic communications network a message or other matter that is grossly offensive or of an indecent, obscene or menacing character”.

The same section also provides that it is an offence to send or cause to be sent a false message

“for the purpose of causing annoyance, inconvenience or needless anxiety to another”.

Such behaviour may also amount to harassment, which is also already an offence.

There has been a successful conviction in which a person was found guilty of harassment after uploading images of a colleague, fully clothed, alongside images on a porn site of women of a similar shape and build as the colleague. Additionally, those who encourage others to commit an existing communications offence may be charged with encouraging an offence under the Serious Crime Act 2007.

I stress, though, that the Government recognise the concerns, set out so eloquently and clearly by my right hon. Friend, about the existing communications offences. The Law Commission considered the specific offences I have set out as part of its “Modernising the Communications Offences” review, to understand whether they needed to be reformed to better tackle abusive and harmful behaviours online. The Commission has now published its final report and recommendations for reform, and my right hon. Friend the Secretary of State for Digital, Culture, Media and Sport has indicated that she is minded to adopt the harm-based offence, the false-communications offence and the threatening-communications offence.

Alongside the use of existing and established criminal sanctions, there is a major role for the websites that host the images. It is encouraging that sites such as Pornhub, Twitter, Reddit and several others have all announced bans on deepfake images. Such images already violate community standards on major social media platforms such as Facebook. Some sites are already beginning to turn to artificial intelligence to police the images, rather than rely on users reporting them—an example of the determination to find effective and new ways to restrict the practice. For example, Facebook uses machine learning and AI to detect near-nude images or videos shared without permission on its platforms. Bumble, a dating app, has its own “Private Detector” safety feature, which automatically blurs a nude image shared in a chat. These are important steps to protect user safety and ensure that the images are tackled head on.

I hope that my right hon. Friend is satisfied that the law can, in most scenarios, deal with this behaviour, and that non-criminal interventions are developing all the time, but it is of course crucial that the criminal law keeps pace with new technologies as they emerge. We continue to keep these issues under review and when we see a problem with the criminal law, we act.

This Government have a strong record when it comes to protecting the public from the abuse of private, intimate imagery. For example, much as a result of my right hon. Friend’s assiduous campaigning, as she said earlier, in 2015 we created the so-called revenge porn offence at section 33 of the Criminal Justice and Courts Act 2015, and only recently, during the passage of the Domestic Abuse Act 2021, we listened to the voices of victims of image abuse and supported provisions to extend that offence to capture those who threaten to disclose private sexual images with an intent to cause distress. That change has now been implemented and I am sure that my right hon. Friend, having fought so hard for the creation of the original offence, welcomes that significant extension of the protection of victims from image-based abuse. In addition, after listening to the victims of upskirting and the excellent campaign for change headed by Ms Gina Martin, we created new criminal offences in the Voyeurism (Offences) Act 2019 specifically to address that intrusive and distressing behaviour. Offenders now face up to two years behind bars, and the most serious among them will be subject to sex offender notification requirements. We do listen and we do respond.

Mrs Miller

My hon. Friend has clearly gone through the shopping list of laws that can be used to try to guard against the misuse of intimate images, but in having a shopping list we have created a lot of gaps, too. For instance, upskirting may be unlawful but down-blousing is not. It is very difficult when we have law that is so prescriptive. Does he have sympathy with the need to have something more encompassing so that we can capture all forms of intimate image abuse and not have to play whack-a-mole by outlawing the latest devious way in which people try to abuse women and girls online?

James Cartlidge

My right hon. Friend makes an excellent point, and once again she highlights her incredible expertise on these matters. She will be aware that the way Parliament often works is that individual campaigns generate momentum and become specific offences—I would not use the phrase “ad hoc,” which is almost demeaning to those campaigns, which are incredibly important and powerful. That is the reality of how this place makes law at times, but she is right that we need to consider the broader picture. I know where her focus is, and I will be coming to the Law Commission, which will feed into that point.

My colleagues in the Department for Digital, Culture, Media and Sport are busy preparing the online safety Bill, which will include provisions to tackle illegal and legal-but-harmful content, including criminal deepfake pornography, sexual harassment and abuse that does not cross a legal threshold. Under the Bill all companies will need to take action against illegal content and ensure that children are protected from inappropriate material. Major platforms will also need to address legal-but-harmful content for adults, which will likely include online abuse. Ofcom will have a suite of enforcement powers to deal with non-compliance, including fines of up to £18 million or 10% of qualifying annual turnover.

The Joint Committee that is scrutinising the Bill is due to report before recess—by 10 December. We will table the Bill as soon as possible, subject to the parliamentary timetable, but we must not rest. I assure the House that we do not take concerns in this sensitive area lightly.

It was with those concerns in mind that the Government asked the Law Commission to review the law on the taking, making and sharing of intimate images without consent, to identify whether there are any gaps in the scope of protection already offered to victims. Importantly, the review has considered the law on manipulated images such as those created by deepfake technology and the protection that the existing law affords.

On 27 February 2021 the Law Commission published the consultation paper on its review, which put forward a number of proposals for public discussion; the consultation ended on 27 May. I understand the Law Commission is due to publish its final recommendations by spring 2022.

Although I welcome this opportunity to discuss the nature of developing technology and the production and sharing of explicit manipulated images, this is a complex area and it is right and proper that we should take time to consider the law carefully before deciding whether to add further to the raft of existing legislation that already addresses these issues. It is important, therefore, to allow the Law Commission to finish its work and to consider in detail and with care any recommendations it produces. The Government await the Law Commission's findings with interest and will consider them carefully.

I believe my right hon. Friend has previously met the Law Commission but, if it would be of interest, I would be more than happy to arrange for her to do so again, based on its latest position.

--- Later in debate ---
Mrs Miller

I am slightly taking advantage of the fact that we have a little more time this evening. The Minister will know that the Law Commission has made its recommendations, which have gone out for consultation. That consultation finished a month or two ago, so it is not that the Law Commission will finish its deliberations in the spring; it has already finished its deliberations. Those recommendations, subject to any input from the consultation, should be available shortly. I still do not understand why he is not able to bring these recommendations forward at the same time as the online safety Bill.

James Cartlidge

My right hon. Friend makes a good point. I wish to clarify this, as a lot of Law Commission reviews are taking place over time. There are two in this regard. The one I believe she is referring to is the one I mentioned earlier, which is the Department for Digital, Culture, Media and Sport one. I believe that has reported and that the Department is now considering it, and it concerns malicious communications and other offences to which I referred earlier. The review on taking, making and sharing is ongoing and will report in spring next year. The point I was making to her was that if she wanted to contribute to that and meet the Law Commission—

Mrs Miller

indicated assent.

James Cartlidge

My officials have noted her positive nodding of the head, and so I would be more than happy to set that meeting up, because she has great expertise. I can assure her that her concerns, and the views and issues raised by this House, will be taken fully into account when the Government consider those findings and the issue of whether reform to the criminal law is necessary.

Question put and agreed to.