Caroline Nokes (Conservative, Romsey and Southampton North)
Debate with the Home Office
Commons Chamber

I thank my hon. Friend, because the speech he gave on Second Reading played a major role in the changes we are introducing today. I reassure him that the change brings into scope most sexual assault cases, terrorist cases and racially aggravated offences, and I confirm to him that the specific case he raised on Second Reading would have been brought into scope by the change for which he has campaigned. I remind the House that the sanction for non-attendance at a sentencing hearing is up to a maximum of another two years in custody.
Government new clause 86 creates an offence of creating a sexually explicit deepfake of an adult without their consent. Members will be aware that the sharing of intimate images, whether real or fake, is already proscribed under the Online Safety Act 2023. We consider that we cannot complete the task of protecting people, principally women, unless we add the creation of pseudo-images or deepfakes to that package of protection. We are the first national legislature to take this step—if I am wrong about that, we are among the first—and we do so because we recognise the inherent risk posed by the creation of these images, both to the individual depicted and to society more widely.
I know that the Minister will have given thought to this, but does she agree that there is a problem not just with deepfake sexual images, but more widely with deepfake images that purport to show individuals and potentially even Members of this House doing and saying things that they have not and that have no sexual connotations whatever?
I am grateful to my right hon. Friend for raising that point. We are encountering a rapidly changing world of deepfake images that can be used for the purposes of manipulating voices to try to influence political attitudes and choices. I have to make it clear that the new clause is confined only to the creation of sexually explicit images. However, it is my hope, humbly expressed at this Dispatch Box, that it may provide a gateway and lever for the development of more law in this area, and I thank her for her intervention.
Before we proceed, I would like to make a couple of observations. These are very serious and sensitive issues that deserve, and are clearly going to get, proper debate. In his closing remarks, the hon. Member for Stockton North (Alex Cunningham) indicated that there are two days for this debate. Earlier, an hon. Member intervened on the Minister to raise a subject that she had not commented upon. There was a good reason for that: it is listed not on the order paper for today but on the order paper for the second day. I ask hon. Members to make quite sure that, when they are discussing these issues, they are discussing those listed on the order paper for today, in the understanding that there will be a second day.
There are 18 hon. Members wishing to speak. I may have missed one, so there may be more. At the moment, we have plenty of time but may I gently urge conciseness rather than self-indulgence? That relates particularly to interventions, which should be interventions and not speeches.
I call the Chair of the Women and Equalities Committee.
I rise to speak to amendment 160, tabled in my name and supported by members of the Women and Equalities Committee, and other colleagues across the House. I will endeavour to be as brief as I can and I reassure everybody that the amendment is on the order paper for today.
I thank my hon. Friend the Minister for her comments on deepfakes. There has been a problem: someone like Taylor Swift can have a deepfake made using her image taken down very quickly, but for ordinary women, or indeed men, from across the UK, who are not famous and do not have a platform, it is very difficult to get deepfake imagery removed. I welcome the steps the Government are taking on that.
I thank the shadow Minister, the hon. Member for Stockton North (Alex Cunningham), for his comments about the amendment. I was not aware that the Opposition were planning to support it, so I thank him for that. I urge my hon. Friend the Minister to pay close attention to what I and other members of my Select Committee will say about the amendment. I recognise that the amendment comes at the eleventh hour, on Report, for which I apologise to my hon. Friend. The reason for that is specifically because of the evidence the Committee heard last week, both in private and in public, from victims of revenge porn.
I welcome the changes that have been brought in under the Online Safety Act to support victims of non-consensual intimate image abuse. However, from the evidence we heard, it is clear that the legislation, in its current form, does not go far enough. It does not give Ofcom the teeth it needs to effectively tackle the fast-spreading, uncontrollable virus that is non-consensual intimate image abuse. It does not force platforms to remove harmful content in its entirety, or require internet service providers to block access to it. In short, it does not make the content itself illegal. The sharing of it is illegal but, even if there is a criminal conviction, the content itself is not regarded as illegal content.
Last week, the Women and Equalities Committee heard from a number of survivors of non-consensual intimate image abuse. In sharing their experiences with us, they have spoken of the catastrophic damage the abuse has had on their lives, confidence and relationships. They told us of their fear of applying for jobs, meeting new people or daring to have any social media presence at all. With all their cases, there was a common theme: even though they had secured a conviction against their perpetrator, their non-consensual content continues to circulate on the internet. Despite relentless work by organisations, such as the Revenge Porn Helpline, to report the content and get it taken down, there is no legal obligation for platforms to remove it.
I thank my right hon. Friend the Chair of the Select Committee for making an excellent point, which supports the point I made earlier. If the Bill had a consent-based creation offence in it, that would outlaw the images that the people she is talking about find so difficult to get off the internet. Surely the Bill provides the opportunity to introduce a consent-based creation offence, rather than the current proposal that potentially provides lots of loopholes, particularly to online apps, to use intention to try to evade the long arm of the law.
My right hon. Friend’s point is exactly right that the issue is consent. In my view, when images are non-consensual, they should be regarded in the same way as if the individual had been digitally raped.
There are also many thousands of cases where a conviction has not been achieved or even sought, where the victim just wants the content taken down or blocked. They too are being denied that peace of mind due to gaps in the current legislative framework. The amendment calls for non-consensual intimate photographs or film to be added to the list of “priority offences” in the Online Safety Act, thus making it “priority illegal content”. The amendment would ensure that non-consensual content, regardless of whether or not a conviction had been achieved, would be, by its non-consensual intimate nature, illegal. It would place duties on platforms to remove it, and require internet service providers to block access to non-compliant sites and platforms, including those hosted outside the UK.
That is precisely the way in which child sexual abuse material is handled. Children cannot provide consent, and the adults in these images have not provided their consent for them to be taken, shared or both, so why should the content be treated so differently? Indeed, when the hon. Member for East Renfrewshire (Kirsten Oswald) put it to my hon. Friend the Minister, during her recent appearance before my Committee, that adult content should be handled in the same way as child sexual abuse material, via a registry to identify, classify and therefore allow for the removal of non-consensual intimate images, the Minister said it would be “a very good idea”. In order to do that, we need to make the content illegal.
It is important to note that intimate imagery does not just refer to photos and videos that are sexually explicit. Indeed, as we heard from David Wright, chief executive of South West Grid for Learning, which runs the Revenge Porn Helpline, within certain countries and cultures, being photographed with an arm around somebody or being filmed without a hijab can have catastrophic implications for a woman. That is why it is so important that any legislative change uses the term “intimate”, not “sexual”, when referring to non-consensual content.
Last week, we heard evidence from Georgia Harrison, who famously was the victim of revenge porn perpetrated by her then partner, Stephen Bear, who later received a criminal conviction for his actions and was sent to prison. Georgia made the point repeatedly that what happened was like “a house fire”, because when the images went up they spread very quickly. The solution was to get them taken down as quickly as possible so that they would not proliferate. The Committee described it as being like a virus that spreads out of control. The issue is not just about Georgia Harrison or famous women who have a platform they can use to ensure their voice is heard.
We also heard from an anonymous victim of Operation Makedom. In that case, the perpetrator had many thousands of victims. He received a 32-year prison sentence, but that young woman is too afraid to have any sort of social media presence because she is terrified that her image will be seen and put through reverse image searches so she will be identified as a victim. Thousands and thousands of the Operation Makedom images still proliferate online and nothing can be done about that because the content itself is not illegal. It remains online and accessible for people in the UK, despite that 32-year prison sentence. That cannot be right. We will be letting down the victims of that abuse, and all other cases of non-consensual intimate image abuse, if we fail to act.
My final point to the Minister is that we also heard about the Criminal Injuries Compensation Authority and the fact that intimate image abuse is not on its list as a violent crime. When someone applies to the authority, expecting or hoping for some small nugget of compensation—a message in effect that they are a victim, they can put the blame and shame to one side, and they have been a victim of a criminal act—that is not even there for them. I have no doubt that is because the list of violent criminal offences was dreamt up many moons ago and intimate image abuse simply has not been added to it. It should be added to the list. As I said earlier, for a woman, or indeed a man, who has had their intimate images put online, circulated freely and proliferated all over the place, that is like digital rape. It is a rape that continues day after day, to be brutally honest, with no end in sight.
Those are the reasons why my Committee has tabled this amendment and why we urge Members to support it and give it serious consideration. I hope that my hon. Friend the Minister will be able to make some comments from the Dispatch Box that might indicate how the MOJ can incorporate such provisions into existing law. If the message coming back to me is that the content is already illegal, I must say that it is not. We must find better ways of getting it down from online platforms.
She has just popped out. She made an outstanding speech, which illuminated and identified yet more of the nefarious ways that child abusers find to conduct some of the most serious offences against children. She knows, as was clear in her constructive speech, that artificial intelligence raises unique problems. I agree without hesitation with the force of what she said, and about the identification of an offence as she has presented it. I recognise that it is our duty as parliamentarians to future-proof our legislation, and I thank her for her detailed work on this issue. I commit to working with her and to trying as best we can to get something ready for Report in the other place.
I pay tribute to my hon. Friend the Member for Carshalton and Wallington (Elliot Colburn) for the sensitive and thoughtful way in which he approached the Law Commission’s report and the issue of hate crimes, and for his new clause 32 to introduce protected characteristics to the Crime and Disorder Act 1998. Of course, I have read the Law Commission’s excellent report on this matter, and I can confirm that a response to it was always forthcoming this year. I want to make two slight qualifications that might explain some of the delay.
Many Members will be aware that the Law Commission did not recommend making sex a protected characteristic for hate crimes, and may remember that there was a campaign to make misogyny a hate crime, which the commission rejected. That required careful thought, because not all the protected characteristics have been treated in the same way. Another issue is the implementation of the hate crime legislation in Scotland, which has been both highly contentious and, I am afraid, somewhat chaotic. Of course, we wish to avoid replicating those mistakes. However, I want to provide reassurance by saying that our intention is to deal with this matter—subject to all the normal approvals—in the House of Lords, and I hope that my hon. Friend the Member for Carshalton and Wallington will come and work with me on it.
The other excellent speech that I want to refer to was that of my right hon. Friend the Member for Romsey and Southampton North (Caroline Nokes). She alighted on two important issues—cyber-flashing and intimate image abuse—that are not on the priority offences list in schedule 7 to the Online Safety Act 2023. That is not because we did not consider them important or sinister offences—she will need no persuading, given everything that we have done on intimate image abuse, that the opposite is true. The fact is that they were not on the statute book, or certainly had not been commenced, when we passed the 2023 Act. I know that the Secretary of State is well aware of that, particularly in relation to both those issues. I know that my right hon. Friend is conducting an urgent review as we speak, and I am sure that, in the weeks ahead, I will be able to update her on where we are on this. I do not want her to think for a moment that we are dragging our feet.
I appreciate that my hon. Friend is seeking to give me an assurance from the Dispatch Box, but it is perhaps not quite as fulsome as I would wish. She says that the priority offences register can be reviewed. It would be very helpful if we had a specific timescale by which the measures could be added. That would give reassurance to all victims that such images will be made illegal in their own right, and that Ofcom and internet service providers will work together to take them down. We already have the criminal offence, so the perpetrators can go to prison, but the victims want the images—the repeat offence—to be removed from the internet.
I listened very carefully to what my right hon. Friend said, and I agree with every single word of it. Some of this sits with the Department for Science, Innovation and Technology, as she knows, so I would need to have a conversation with the relevant Minister, but I feel as strongly as she does on this matter, and I assure her from the Dispatch Box that I will use my best endeavours.
The road traffic amendments, which I will talk about briefly, were beautifully presented during the Committee and again today. I have spoken a few times with the Members who tabled them, who are well aware that those matters sit with the Department for Transport. I understand that they have had engagement with the Department and that an important review of this issue has certainly been contemplated.