Jess Phillips (Labour - Birmingham Yardley)
Debate with the Home Office
Public Bill Committees
I was unwell when Clare Wade gave evidence to this Committee, so I missed it. Are we to assume that the clause will be used in the prosecution of cases where self-harm is caused by incidents within domestic abuse relationships or as a result of grooming, sexual violence and broader violence against women? I think that it was clarified during the evidence session that that was the case.
I thank the hon. Lady for her question. It is quite clear that Parliament’s intention, in the way that we are framing the clause, and how the clause might actually play out when it comes before the courts, are probably quite different. I have been thinking about that myself. This is very much an extension of what I may call—I hope you will forgive me if I use this as a shorthand—the “Molly Russell” principle, which was established by that tragic case and led to all the new principles of the Online Safety Act—bringing them into line with the offline environment.
However, I think that you are quite correct; when we read clause 11, we see that it belongs in a range of different circumstances, all of which I have thought through. Yes, I think that you are right to say that it could very easily exist within a domestic—
My apologies. I am sorry for being too informal; I am not familiar with this. I think that it is the case that the issue is readily identifiable within certain forms of domestic abuse scenario, and that the clause would apply in those circumstances. It is obvious in the statutory language.
I will speak more broadly about the issue in a moment, and I am pleased to hear what the Minister has said; that is what we would all want to see. However, I am concerned about the either-way offences that the Minister outlined. Let us say that in a case of suicide a coroner found that domestic abuse had been involved—I mean, chance would be a fine thing in most cases—and a manslaughter charge was laid and then the perpetrator pled guilty. There has only been one case of this. I just wonder how these summary limits and these either-way offences would work in that situation.
I thank the hon. Lady again for her question. Actually, I think that we would have to concede immediately that it would be on the charge sheet. However, the hon. Lady has raised the topical, important and very difficult issue of whether or not a domestic abuse perpetrator has elicited suicide in circumstances where, as she will know, there are evidential difficulties. There is a discussion happening within Parliament, and more widely within the legal profession, about the offence of manslaughter and its ambit when it takes place in the context of suicide.
Perhaps I can reassure the hon. Lady, though, by saying this: if we stop short of suicide—very much mindful of the fact that that engages quite difficult legal issues—and we think about the offences created under clause 11, I think that it is almost inconceivable that there would be a circumstance in which a clause 11 offence existed and was not accompanied by an offence of coercive control under the Domestic Abuse Act 2021. I just do not think that, in a domestic abuse context, those two things would not exist in parallel. Therefore I think that we would already be looking at a more serious form of sentencing if we were into an “eliciting self-harm” clause 11 offence. It would also be automatically brought under the ambit of the Domestic Abuse Act, and it is already a more serious offence in that context.
Clause 12 is the facilitation element of the offence, and subsection (1) provides that anyone who arranges for somebody else to do an act capable of amounting to inducing self-harm is also committing an equivalent offence. Subsection (2) provides that an act can be capable of encouraging or assisting self-harm even when done in circumstances where it was impossible for the final act to be performed. For example, if pills were provided to a person and they turned out not to be the pills that were intended, it is exactly the same offence. Equally, if something harmful was sent by post but never arrived, the offence and sentence are the same regardless.
Subsection (3) provides that an internet service provider does not commit the offence merely by providing a means through which others can send, transmit or publish content capable of encouraging or assisting serious self-harm. Subsection (5) provides that section 184 of the Online Safety Act 2023 is repealed in consequence of these provisions, which create a much broader basis, bringing the online and offline environments into parity.
The Minister and I have had some back and forth on this. I rise really to hammer home the point regarding the good intentions of the clause, but the need to think about it in the context of a domestic abuse, grooming or sexual violence situation. It is undoubted in any professional's mind that one of the consequences of violence, abuse and coercion against an individual, particularly among young women, is self-harm and suicide.
As the Minister rightly says, it is important that we recognise that in the vast majority of cases self-harm falls short of suicide. There is a huge amount of self-harm going on across the country, genuinely encouraged as a pattern of domestic abuse, and we need to ensure that this piece of perfectly reasonable legislation, which was designed for those on the internet trying to get people to be anorexic and all of that heinous stuff, which we are all very glad to have not had to put up with in our childhood—I look around to make sure that we are all of a relatively similar age—also covers that.
There is one particular risk: how does the clause interact with institutions? Perhaps the Minister could assist me with that. The Minister for Crime, Policing and Fire, a Home Office Minister, is sat in front of me. I was a few minutes late for the sitting this morning because I was in court with one of my constituents in a case—I am afraid to say—where we were on the other side from the Home Office. My constituent literally had to take medication during the court proceedings, such is the mental health trauma that has been caused to her by the Home Office. I wonder how this piece of legislation might be used. I suppose I worry that there is too much opportunity for it to become useful, in that there are so many ways in which institutions and individuals cause people to end up in a self-harm and suicidal situation. I seek clarity on that, unless Ministers wish to be found wanting by the Bill.
I commend my hon. Friend the Member for Birmingham, Yardley for offering a powerful dose of reality about what is happening and the risks. We know that abusers will find every possible gap and try to use them to perpetrate their abuse and these heinous crimes. We must follow them and close those gaps the best we can—or, even better, get ahead.
Clauses 11 and 12 make good the recommendations of the Law Commission in its 2021 “Modernising Communications Offences” report. The Minister described that as important and I echo her comments. The clauses also finish what was started during consideration of the Online Safety Bill. We supported it at that point, and the Bill was well scrutinised, so I will not rehash that debate.
The Government amendments extend the provisions to Northern Ireland. I wonder whether there is a different story about Scotland, because most of the Government amendments expand provisions to Scotland as well as to Northern Ireland. I would be interested in the Minister’s comments on that.
I will finish on the point that my hon. Friend the Member for Birmingham, Yardley made about institutions. Throughout my time in Parliament, the issue of conversion therapies has been at the forefront. We wish that we were getting on with banning them today—goodness knows how much longer we will have to wait—but we know that very harmful self-harm practices can be part of those therapies. Will the Minister say, in responding to my hon. Friend the Member for Birmingham, Yardley, how accountability will fall in cases like that? That is important; if there is a gap for a certain organisation, perhaps we need to return to this. It might be that we will be assisted by the provision in clause 14 that, where a significant senior person in an organisation commits a crime, the organisation can be held accountable. Perhaps that is the way to close the gap—I do not know. I will be interested in the Minister’s view.
Well, okay, but I struggle to conceive of circumstances, other than very unusual and extreme ones, where it would be said that a statutory body was doing an act with the intention of eliciting the consequence of self-harm. Anyway, the point has been made and I have responded to it. I know the hon. Lady’s case is an emotive one.
I am not going to talk about my case, but with regard to the charge sheet, coercive control legislation does not currently cover adults who are sexually exploited in grooming situations. In the case of a woman who is sexually exploited by an adult, like the woman I was with this morning, coercive control legislation does not apply. However, self-harm—I mean, I am going to say that literally being forced to be raped by 20 men a day is self-harm—is absolutely part of the pattern of coercion and abuse that those people suffer, so we would assume that adult-groomers would be covered by the Bill.
I thank the hon. Lady for her intervention. I think a very helpful fabric of possible scenarios has been identified this afternoon. I simply say that in the different circumstances that she has just outlined, there are different criminal offences that would also apply. My simple point is that a case of the nature that she has described would not be confined to a section 11 offence under the Criminal Justice Act 2024, as I hope it will become in due course; there would be a range of serious criminality connected to that.
There isn’t. I hope, as the Minister hopes, that there will be by the time we have got to the end of our scrutiny of the Bill, but there is no crime of grooming adults in sexual exploitation; that exists only for children as an aggravating factor in offences. I suppose pimping legislation would not count in the case I mentioned if self-harm was caused. I do not think there are other bits of legislation for adult victims of sexual exploitation.
Order. We are having a very important and thoughtful debate, but can we please try to observe the normal procedures so that Hansard colleagues, and those who are watching, can catch all of the proceedings?
The clause is the latest in a sequence of legislation dealing with intimate image abuse. People may correct me if I am wrong, but I think I am right to say that we have not dealt with intimate image abuse until this Parliament. The first time it hit the statute book properly was the Domestic Abuse Act 2021. I think it is also right to say that, as a Parliament, we have framed it correctly as something that is more often than not just another ugly incarnation of coercive control. It is highly intrusive, humiliating and distressing conduct.
In November 2022, following the passage of the Domestic Abuse Act, the Government announced their intention to create a suite of new offences to deal with intimate image abuse, closely based on the Law Commission’s recommendations in its July 2022 report. Under the Online Safety Act 2023—I hope the Committee will not mind if I spend a moment on the chronology and the legislative journey on intimate image abuse—the Government repealed the offences of disclosing or threatening to disclose private sexual images, replacing them with four new offences of sharing or threatening to share intimate images.
The Bill goes further to tackle the taking of intimate images without consent, and the process of installing equipment for that purpose. First, it repeals two voyeurism offences related to voyeurism of a private act and taking images under a person’s clothing, for which we use the shorthand “upskirting”—although that precedes the life of this Parliament, so I am wrong about that. Anyway, both those offences are reasonably new and have resulted in amendments to the Sexual Offences Act 2003. The Bill will replace them with new criminal offences to tackle the taking or recording of intimate images without consent and the installing of equipment for such purposes.
Those taking offences build on the sharing offences identified in the Online Safety Act to provide a unified package of offences using the same definitions and core elements. That addresses the criticism that there was previously a patchwork of protection, which the police told us led to gaps in provision when it came to this type of behaviour. I pay tribute to my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is not a member of the Committee. She has done a lot of work on the issue, and identified this problem in particular. As we know, one of the issues was proving intent.
I am grateful to the Law Commission for its work. It consulted widely with the police, prosecutors and legal practitioners, so we could not only read its report, but hear from a range of experts, including those supporting and campaigning on behalf of victims, and others who are far more knowledgeable than any of us.
The clause will insert a suite of new provisions after section 66 of the 2003 Act. The clause will create three new offences: the taking or recording of an intimate photograph or film without consent; and two new offences about installing equipment to enable a taking offence. I will go through them briefly.
The first provision of the clause is the creation of what we call a base offence of intentionally taking any image of a person in an intimate state without their consent. That amounts to what we will call a section 66AA offence. It removes the requirement for a reason or motive. It does not matter if the person was doing it for a joke or for financial payment, or even if their reason was not particularly sinister. The base offence would be met if those elements were established. The offence is triable summarily only and will attract a maximum prison term of six months.
The wording of the two more serious offences mirrors some of the language that we are familiar with; the offences refer not just to “intentionally” taking an image, including of a person in an intimate state without their consent, but to having the intent of causing them “alarm, distress or humiliation”, or taking the image for the purpose of “obtaining sexual gratification” for themselves or another person. The offences are serious and carry a maximum sentence of two years. The three offences are designed to achieve the right balance between the protection of the victim and the avoidance of any over-criminalisation. I will return to that when I speak to new clause 20, tabled by the hon. Member for Birmingham, Yardley.
The base taking offence is subject to a defence of reasonable excuse, such as a police officer taking an image without consent for purposes connected with criminal proceedings. Similarly, a base sharing offence is subject to the defence of reasonable excuse; for example, images taken for the purpose of a child’s medical treatment would meet that threshold, even if the victim was distressed by that. There is another exemption—I do not know who came up with this example, but it is a good one—if the image is taken in a public place and the person shown in the image is in the intimate state voluntarily. A distinction is therefore drawn between, for example, a photo of a streaker at a football match, and that of someone who had a reasonable expectation of privacy; that would relate to upskirting, for example.
We are also creating two offences to do with the installation of spycams, which I am afraid we see more and more of in cases going through the courts: an offence of installing, adapting, preparing or maintaining equipment with the intention of taking or recording an intimate photograph or film; and an offence of supplying for that purpose. To be clear, it will not be necessary for the image to have been taken; if equipment was installed for that purpose, that is enough to meet the requirements of the offence.
Overall, the clause amends the Sexual Offences Act 2003 to ensure that notification requirements can be applied, where the relevant criteria are met, to those convicted of the new offence of taking for sexual gratification and installing with the intent to enable the commission of that offence. I commend the clause to the Committee. I will respond to the new clause later.
I will be brief. New clause 20 would extend the definition of “intimate image” to include specific categories of image that may be considered intimate by particular religious or cultural groups—for example, instances of a person not wearing modesty clothing such as a hijab or niqab when they would normally do so.
I am very sensitive to the issues that have been raised and will respond to them, but I will also explain why we do not accept the new clause.
We have steered very close to the course recommended by the Law Commission in what we have defined in law as an intimate image. It includes anything that shows a person who is nude or partially nude, or who is doing anything sexual or very intimate, such as using the toilet. It is a wider definition of “intimate” than was used in the revenge porn provisions under the Domestic Abuse Act 2021. We have expanded it, but we have confined it to what we think anyone in this country would understand as “intimate”.
One of the challenges in adopting a definition of “intimate” that includes, for example, the removal of a hijab is that we are creating a criminal offence of that image being shared. It would not be obvious to anyone in this country who received a picture of a woman they did not know with her hair exposed that they were viewing an intimate image and committing a criminal offence. The Law Commission has made very similar points in relation to showing the legs of a woman who is a Hasidic Jew, or showing her without her wig on. This would be grotesquely humiliating for that victim, but that would not be completely obvious to any member of the public who might receive such an image of them.
I will, but I would like to develop this point a little bit more.
I strongly suggest that the hon. Lady does not come from the same community as me. I described images being sent to the community; the nature of the image would absolutely be clear to lots of people where I live.
I was going to complete the point. If the hon. Lady will forgive me, I will do so before I give way again. We have to create laws that apply equally to everybody in the United Kingdom. If we are to create an offence of sharing intimate images, we have to have a translation of intimacy that is absolutely irrefutable to anybody sending that image around. Even if they do not know the person in the image, it has to be absolutely clear to the sender that they are sending an intimate image. I have already made the point that it would not be immediately obvious to everyone in the United Kingdom that an image of a woman showing her hair was a humiliating image of her. It would not automatically be an intimate image even if the person sharing it knew that the woman in the image was Muslim, because some Muslim women do not wear headscarves.
The hon. Member for Birmingham, Yardley described a very dark case. She mentioned the language of blackmail and honour-based violence. She intimated coercive control. My simple point is that in the circumstances she has identified, there are a host of serious criminal offences being committed in conjunction with the use of the intimate image. We would say, very respectfully, that we think that kind of crime belongs much more comprehensively within other offences.
I am not going to engage in a case-by-case discussion. It is so difficult for me to do that; I do not have the papers in front of me. I understand the issue about community-based events, but if the purpose of sending the image is to blackmail a person, they have already engaged another element of the criminal law, and there is already aggravation, in that the perpetrator is being domestically abusive or is committing an honour-based offence, as the hon. Lady described.
I want to make it clear that by introducing the base offence, this legislation is removing the need to show an intention to cause distress. That is the issue that Georgia Harrison had, but managed to circumvent when she got that very successful and high-profile conviction against Stephen Bear, who went to prison for two years. She had an evidential difficulty in proving intent in her case. Although she did, she then became a really powerful advocate for removing intent from the offence, and we have done so.
I am not for a moment suggesting that there will not be cases of maximum sensitivity in which somebody is humiliated, but as I say, in the case that the hon. Member for Birmingham, Yardley described, in the background, other offences were materialising. Our view is that it is more appropriate that they are dealt with under other elements of the law, rather than our muddling the police response, or even creating offenders where we do not mean to, because under the hon. Lady’s offence, the offender does not know they are committing an offence. They might think that they are sharing an image of a glamorous woman, not knowing that it is grossly offensive that they have shown a picture of a woman who does not have her hair covered as she normally would, because they do not know her.
I hope that answers the hon. Lady. With great respect, I urge her not to press her new clause. However, I would like to hear from her, because I did not give way to her a moment ago.
The rules allow the hon. Member for Birmingham, Yardley, to come back again—and the Minister can, in fact, respond again, if she would like to.
I understand exactly where the Minister is coming from. I understand not wishing to over-criminalise anybody for something accidental. I will just say that chance would be an absolutely fine thing. In the case that I was talking about, the police laughed at the woman when she went to them about it. Sometimes we on these Committees say, “Well, there’s already an offence for that,” and I think, “Is there?” In real life, there is not, when the rubber hits the road. I am not sure how many times people in this room have tried to get these criminal cases across the line. I do it every single week. In my life, I have done thousands and thousands.
The argument is the same for this legislation: what is the point of having it? Take Georgia Harrison’s case—let me give her a shout out. Good luck to her on “Love Island: All Stars”. I will definitely be supporting her; she is a friend of mine. There are probably all sorts of bits of legislation around posting an image of an ex partner. We say about spiking, “Well, there is already legislation for that,” but it does not work. Our job is to try to make laws that work in real life. I am afraid to say that there will be lots of cases of the kind that I am talking about. There just will, and the women involved will not be able to rely on this legislation.
The Minister said, “We try to make laws for all people in our country.” It does not always feel like that. We leave loads of people out. I will not press the new clause to a Division, because my point has been made. I am drawing a line in the sand when it comes to people in this Committee telling me, “There is another law for that,” when I know fine well that those other laws do not work.
The hon. Member for Wyre Forest makes a very good point. The reason that I stopped short of doing that is that I was trying to stay within the “intimate” framing, but he is absolutely right. As we go into an election year, we will see, both in the States and over here, that being a real challenge to our democracy and to how we conduct campaigning. This provision would certainly not be right for it, but a new clause might be. That is good inspiration from the hon. Member, and I am very grateful for it.
The Committee heard about this during the evidence sessions for the Bill. Dame Vera Baird, the former Victims’ Commissioner, made the point very powerfully. She said that this use of deepfakes
“needs making unlawful, and it needs dealing with.”––[Official Report, Criminal Justice Public Bill Committee, 12 December 2023; c. 62.]
Indeed, she said she could not understand why they had not been banned already, and I agreed with her on that point. Amendment 57 is designed to address that. It will make it an offence for someone to intentionally create or design
“using computer graphics or any other digital technology an image or film which appears to be a photograph or film of another person...in an intimate state”,
whether that be for “sexual gratification”,
“causing alarm, distress or humiliation”
or offences under the Sexual Offences Act 2003.
The amendment is an important addition to what we have. Some important progress was made with the Online Safety Act 2023, but I think this finishes the job. I am interested in the Government’s view on whether where they went with the Online Safety Act is where they intend to finish, as opposed to going that little bit further. I will close on that point, but I will be very interested in the response.
I rise to support both amendments, and, in fact, what the hon. Member for Wyre Forest said as well. No one should have the ability to host an image of a person that they did not want out there in the first place. Unfortunately, what people tend to get back is that it is very difficult to place these things, but all sorts of things around copyright are traced on all sorts of sites quite successfully. We put a man on the moon 20 years before I was born, and brought him back. I reckon we could manage this and I would really support it.
Turning to the point made by the hon. Member for Wyre Forest and the issue of faking intimate images, I am lucky enough to know—I am almost certain that most of the women in this room do not know this about themselves—that deepfake intimate images of me exist. As I say, I am lucky enough to know. I did not ever once consider that I should bother to try to do anything about it, because what is the point? In the plethora of things that I have to deal with, especially as a woman—and certainly as a woman Member of Parliament in the public eye—I just chalk it up to another one of those things and crack on, because there is too much to be getting on with. But on two separate occasions, people have alerted me to images on pornographic websites of both me and my right hon. Friend the Member for Ashton-under-Lyne (Angela Rayner); they have a thing for common women, clearly. There is nothing that even somebody in my position can do about it.
The first time I ever saw intimate images of me made on “rudimentary” Photoshop, as my hon. Friend the Member for Nottingham North called it, if I am honest, like with most abuses against women, I just laughed at it. That is the way we as women are trained to deal with the abuses that we suffer. They could only be fake images of me, because, unlike my children, I do not come from an era where everybody sends photos of everybody else naked. As a nation, we have to come to terms with the fact that that is completely and utterly normal sexual behaviour in the younger generation, but in that comes the danger.
The reality is that this is going to get worse. Rudimentary Photoshop images of me were sent to me about five years ago, or even longer—we have been here for ages. Covid has made it seem even longer. The first time I saw fake images of me, in a sexualised and violent form, was probably about eight years ago. Over the years, two, three or four times, people have sent me stuff that they have seen. I cannot stress enough how worrying it is that we could go into a new era of those images being really realistic. On the point made by the hon. Member for Wyre Forest, I have heard, for example, two completely deepfake recordings of my right hon. and learned Friend the Member for Holborn and St Pancras (Keir Starmer) that were put out and about. To be fair to Members on the Government Benches, they clearly said, “This is fake. Do not believe it; do not spread it.” We must have that attitude.
However, it is one thing to stop something in its tracks if it is the voice of my right hon. and learned Friend the Member for Holborn and St Pancras saying, in that instance, that he did not like Liverpool, but that is nothing compared with the idea of me being completely naked and beaten by somebody. It is like wildfire, so I strongly encourage the Government to think about the amendments and how we make them law.
Opposition Members have made two very good points, which I will respond to. The issue of publishing or hosting unlawfully obtained internet photographs is salient. It was probably thrown into its sharpest relief by Nicholas Kristof at The New York Times when he did a big exposé of Pornhub. I have never read off my phone in any parliamentary sitting before, but I will briefly do so, because the opening to his article is one of the best that I have read about Pornhub:
“Pornhub prides itself on being the cheery, winking face of naughty, the website that buys a billboard in Times Square and provides snow plows to clear Boston streets. It donates to organizations fighting for racial equality…Yet there’s another side of the company: Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering”.
The point is very well made.
Under the Online Safety Act 2023, we have ensured that all user-to-user services in scope of the illegal content duties are required to remove that type of illegal content online when it is flagged to them or they become aware of it. That would cover something such as the Pornhub apps I have described. We believe that the robust regulatory regime for internet companies put in place by the Act, with the introduction of the offence of sharing intimate images, which extends to publication, are the most effective way to deal with the problems of the spread of that material.
Our essential answer is that under the Online Safety Act a host site—I have given a big name, because I am critical of that particular site—would be under a legal obligation to remove content flagged to it as featuring prohibited content, so it would have an obligation under the law to remove an intimate image of an individual created without their knowledge or consent or to be subject to criminal sanctions. Under the Online Safety Act, those are substantial; Parliament worked collectively to ensure that meaningful sanctions would be applied in that regard.
There is a concern that creating a new offence would partially overlap with existing criminal offences—for example, that we would basically be duplicating some of the provisions under section 188 of the Online Safety Act. We worry that that would dilute the effectiveness with which such activity will be policed and charged by the Crown Prosecution Service. I understand that the provisions under the Act have not yet been commenced, so we would be legislating on top of legislation that has not been commenced. Respectfully, I invite hon. Members to allow the Act to come into force comprehensively before we make an assessment of whether we need to legislate again on the issue of hosting unlawful content. However, I am sympathetic to it, and I think the whole House agrees with the principle.
Equally, the Law Commission was asked to look at the issue of deepfakes, which it considered and responded to. I will remind the Committee of how it undertook its inquiry into the issue. It undertook a full public consultation on the point and engaged with the CPS and police, and it concluded that an offence of making deepfakes was not necessary. It identified certain associated risks, including difficulties for law enforcement and, again, the risk of over-criminalisation, which potentially would outweigh the benefits. The Government share the view of the Law Commission and have decided not to create a separate making offence.
I will provide hon. Members with some reassurance: nobody is in any doubt about the risk. The hon. Member for Birmingham, Yardley described harmful, culpable conduct relating to her personally and to other senior politicians in this House. My hon. Friend the Member for Wyre Forest gave hypotheticals that could easily materialise, and we all know that there is an increased risk of that as we move into an election year on a global scale, because elections are happening all over the world this year. Nobody doubts the risk. I want again to provide the reassurance that such conduct generally involves sharing of these images, or threats to share, both of which are criminalised by offences under the Online Safety Act, or by other offences—communication offences and harassment offences—so it is already captured.
The secondary issue identified by the Law Commission concerns the prosecution difficulties, because it would be difficult to prove some elements of the offence, such as an intention to cause distress, in circumstances in which the image had not been shared—by the way, I exclude from that a circumstance in which the defendant has told the victim that they hold the image, because that has already crossed the threshold. The question that I asked officials—I have now lost the answer, but they did give it to me. Hang on a minute; someone will know where it is. Will the Committee give me one moment?
I thank the hon. Gentleman for his forbearance. Just to pick up on that point, I think he is right to hold the Government’s feet to the fire on the commencement of the Online Safety Act, because it is all very well having these provisions in law, but if they are not actually operational, they are not doing any good to anyone. I accept that tacit criticism, in so far as it may be advanced. I recognise that implementation now is critical; commencement is critical.
I will disclose the question that I put to officials. I was interested in what happens if, for example, a schoolboy creates a deepfake of another pupil and does not share it, so that it is not covered by the Online Safety Act—whether that is none the less an offence. I am told that it is covered by two separate bits of legislation. One is section 1 of the Protection of Children Act 1978, which includes making indecent images of a child, including if that is a deepfake, which would be covered by the statutory language. The second provision is section 160 of the Criminal Justice Act 1988, which covers possession of any indecent image of a child and would include a case where an image had been superimposed.
I am satisfied that the current law, including the Online Safety Act—I have already accepted that there are commencement issues—deals with deepfakes. I am sensitive to the prosecutorial difficulties that I have identified and I think that these are covered, particularly by the Online Safety Act. We accept the Law Commission’s very careful work on the issue, which was a detailed piece of research, not just a short paragraph at the end. On that basis, I very respectfully urge the hon. Member for Nottingham North to withdraw or not press the amendments.
On the answer that the Minister got from her officials, there are so many bits of legislation about abuses of children, sexual violence towards children, sexual grooming of children and sexual exploitation of children, and there are none about adults, as though such behaviour is not harmful when someone turns 18. If the same kid in the same class is 17 and makes images of a person who is turning 18, the view is that one day it would be a problem and the next day it would not, as though the abuse of adult women is just fine. The Online Safety Act does not say the word “woman” once, so I will gently push back on the idea that it deals with this. I am going to scour Pornhub now—I will not do it while I am in Parliament in case somebody sees me—to look for these images, and I will rise to the Minister’s challenge. I am going to go to the police once the Online Safety Act is in force and we will see how far I get.
I thank the hon. Lady for her point. She makes very good points, as she always does, and that is a legitimate challenge. I would just ask her also to bear this in mind. She has heard our answer. First, we are accepting the Law Commission’s recommendation for now. Secondly, we think the Online Safety Act covers what she has described in terms of sharing. The third point that I draw her attention to is the pornography review launched today. That is a critical piece of work, and she made the good point that we focus extensively on children; there is a really important element of that.
First, we know that there is a dark web element where a lot of online pornography is focused directly on child pornography. We also know that adult pornography contributes not only to the pubescent nature of the abuse that we see in violence against women, but to violence against women much more widely. I have spoken about this; the hon. Lady has spoken about this—we have been in the Chamber together numerous times talking about it. I hope that that review will get on top of some of the issues that she is raising today. I hope she will accept our gentle refusal of her amendment and maybe consider withdrawing it.
—informative and important. I would be very grateful if she could save them up and use them in her interventions so that we get them on the record, rather than overhearing them from a sedentary position, if she would be so kind.