Online Safety Bill (Tenth sitting) Debate

Kirsty Blackman (Aberdeen North) (SNP)

I thank the Minister for tabling the amendments. In the evidence sessions, we heard about omissions in schedule 7 arising from the absence of Northern Irish and Scottish offences. Such offences were included in schedule 6 but, at that point, not in schedule 7.

I appreciate that the Minister has worked with the devolved Administrations to table the amendments. I also appreciate the way in which amendment 126 is written, such that the Secretary of State “must consult” Scottish Ministers and the Department of Justice in Northern Ireland before making regulations that relate to legislation in either of the devolved countries. I am glad that the amendments have been drafted in this way and that the concern that we heard about in evidence no longer seems to exist, and I am pleased with the Minister’s decision about the way in which to make any future changes to legislation.

I agree with the position put forward by the hon. Member for Pontypridd. My understanding, from what we heard in evidence a few weeks ago, is that, legally, all providers will have to meet the higher bar of the offences, and therefore anyone anywhere across the UK will be provided with the additional level of protection. She is right that the offence might not apply to everyone, but the service providers will be subject to the requirements elsewhere. That is my view too. Once again, I thank the Minister.

Chris Philp

Briefly, I hope that the amendments provide further evidence to the Committee of the Government’s willingness to listen and to respond. I can provide the confirmation that the hon. Members for Aberdeen North and for Pontypridd requested: the effect of the clauses is a levelling up—if I may put it that way. Any of the offences listed effectively get applied to the UK internet, so if there is a stronger offence in any one part of the United Kingdom, that will become applicable more generally via the Bill. As such, the answer to the question is in the affirmative.

Amendment 116 agreed to.

--- Later in debate ---
Kirsty Blackman

Can my hon. Friend see any reason—I am baffled by this—why the Government would leave out human trafficking? Can he imagine any justification that the Minister could possibly have for suggesting that it is not a priority offence, given the Conservative party’s stated aims and, to be fair, previous action in respect of, for example, the Modern Slavery Act 2015?

John Nicolson

It is an interesting question. Alas, I long ago stopped trying to put myself into the minds of Conservative Ministers—a scary place for any of us to be.

We understand that it is difficult to regulate in respect of human trafficking on platforms. It requires work across borders and platforms, with moderators speaking different languages. On the Joint Committee on the draft Bill, we discovered that Facebook does not moderate content even in English to any adequate degree. Just look at the other languages around the world—do we think Facebook has moderators who work in Turkish, Finnish, Swedish, Icelandic or a plethora of other languages? It certainly does not. The only language that Facebook tries to moderate—deeply inadequately, as we know—is English. We know how bad the moderation is in English, so can the Committee imagine what it is like in some of the world’s other languages? The most terrifying things are allowed to happen without moderation.

Regulating in respect of human trafficking on platforms is not cheap or easy, but it is utterly essential. The social media companies make enormous amounts of money, so let us shed no tears for them and the costs that will be entailed. If human trafficking is not designated a priority harm, I fear it will fall by the wayside, so I must ask the Minister: is human trafficking covered by another provision on priority illegal content? Like my hon. Friend the Member for Aberdeen North, I cannot see where in the Bill that lies. If the answer is yes, why are the human rights groups not satisfied with the explanation? What reassurance can the Minister give to the experts in the field? Why not add a direct reference to the Modern Slavery Act, as in the amendment?

If the answer to my question is no, I imagine the Minister will inform us that the Bill requires platforms to consider all illegal content. In what world is human trafficking that is facilitated online not a priority? Platforms must be forced to be proactive on this issue; if not, I fear that human trafficking, like so much that is non-priority illegal content, will not receive the attention it deserves.

Chris Philp

The first thing to make clear to the Committee and anyone listening is that, of course, offences under the Modern Slavery Act 2015 are brought into the scope of the illegal content duties of this Bill through clause 52(4)(d), because such offences involve an individual victim.

Turning to the priority offences set out in schedule 7—I saw this when I was a Home Office Minister—modern slavery is generally associated with various other offences that are more directly visible and identifiable. Modern slavery itself can be quite hard to identify. That is why our approach is, first, to incorporate modern slavery as a regular offence via clause 52(4)(d) and, secondly, to specify as priority offences those things that are often identifiable symptoms of modern slavery and that can feasibly be identified. Those include many of the offences listed in schedule 7, such as causing, inciting or controlling prostitution for gain, as in paragraph 16 on sexual exploitation, which is often the manifestation of modern slavery; money laundering, which is often involved where modern slavery takes place; and assisting illegal immigration, covered in paragraph 15 as per section 25 of the Immigration Act 1971, because modern slavery often involves moving somebody across a border.

Modern slavery comes into scope directly via clause 52(4)(d), and because the practicably identifiable consequences of modern slavery are listed as priority offences, I think we do have this important area covered.

Kirsty Blackman

I appreciate that the Minister thinks that there are other measures that cover this offence, but will he keep it under consideration going forward? I do not think that that is too much to ask. Part of the logic behind that is that some of the other issues, where the reasons behind them must be proved, are much more difficult to define or prove than the modern slavery offences that we are asking to be added here. Whether he accepts the amendment or not, will he commit to considering the matter and not just saying, “Absolutely not”? That would be helpful for us and the many organisations that are keen for such things to be included.

Chris Philp

I am happy to give that further consideration, but please do not interpret that as a firm commitment. I repeat that the Modern Slavery Act is brought into the scope of this Bill via clause 52(4)(d).

--- Later in debate ---
Kirsty Blackman

I have a couple of questions for the Minister. The first is about the interaction of subsection (4)(c) and subsection (5). I am slightly confused about how those provisions interact, because subsection (4)(c) states that anything that is not within the terms of primary priority content or priority content but is harmful to

“an appreciable number of children”

is included as

“content that is harmful to children”.

That is completely reasonable. However, subsection (5) excludes illegal content and content with a “potential financial impact”. I appreciate that these provisions are drafted in quite a complicated way, but it would be useful to have an understanding of what that exclusion means. If it means that harm cannot be recognised where it is financial in nature, that is a problem, because it would explicitly exclude gambling-type sites, loot boxes and anything of that sort, which by their nature are intentionally addictive and try to get children or adults to part with significant amounts of cash.

How will clause 53 be future-proofed? I am not suggesting that there is no future proofing, but it would be helpful to me and fellow Committee members if the Minister explained how the clause will deal with new emerging harms and things that may not necessarily fall within the definitions that we set initially. How will those definitions evolve and change as the internet evolves and changes, and as the harms with which children are presented evolve and change?

And finally—I know that the Minister mentioned earlier that saying, “And finally”, in a speech is always a concern, but I am saying it—I am slightly concerned about the wording in subsection (4)(c), which refers to

“material risk of significant harm to an appreciable number of children”,

because I am not clear what an “appreciable number” is. If a child stumbles upon content that is incredibly harmful, and that content causes significant harm to that one child, is it okay for the provider to host such content simply because it is not likely to be accessed by an “appreciable number of children” and might be accessed by only a small number? If the Minister could give us an understanding of what the word “appreciable” means in that instance, that would be greatly appreciated.

Chris Philp

There are one or two points to pick up on. A question was raised about algorithms, and it is worth saying that the risk assessments that platforms must undertake will include consideration of the operation of algorithms. It is important to make it absolutely clear that that is the case.

The shadow Minister asked about the definition of harm, and whether all the harms that might concern Parliament, and many of us as parents, will be covered. It may be helpful to refer to the definition of harm provided in clause 187, at the top of page 153. Committee members will note that the definition is very wide and that subsection (2) defines it as “physical or psychological harm”, so I hope that partly answers the shadow Minister’s question.

--- Later in debate ---
Kirsty Blackman

However, I do not think that loot boxes even existed in 2005, when that Act was passed. Loot boxes are gambling. They may not be covered by that legislation, but they are gambling. Will the Minister consider whether those harms are unintentionally excluded by clause 53?

Chris Philp

We are getting into some detail here. In the unlikely event that any member of the Committee does not know what a loot box is, it is where someone playing a game can buy extra lives or enhance the game’s functionality somehow by paying some money. There have been some cases where children have stolen their parent’s credit card and bought these things in large numbers.

Kirsty Blackman

Having played lots of games, I can clarify that people do not know what they are getting with a loot box: they are putting money forward but do not know whether they will get a really good piece of armour or a really crap piece of armour. It is literally gambling, because children do not know what will come out of the box, as opposed to just buying a really good piece of armour for £2.99 on their parent’s credit card.

Chris Philp

However, the reward is non-monetary in nature. For that reason, the Government’s view—if I can test your patience momentarily, Sir Roger, as we are straying somewhat outside this particular debate—is that loot boxes will not be covered by the gambling review, because we do not see them as gambling. However, we do see them as an issue that needs to be addressed, and that will happen via the online advertising programme, which will be overseen by the Minister for Media, Data and Digital Infrastructure, my hon. Friend the Member for Hornchurch and Upminster (Julia Lopez). That will happen shortly and advertising legislation will follow, so loot boxes will be addressed in the online advertising programme and the subsequent legislation.

The other question raised by the hon. Member for Aberdeen North was about the definition of “an appreciable number”. I have a couple of points to make. By definition, anything that is illegal is covered already in schedule 7 or through clause 52(4)(d), which we have mentioned a few times. Content that is

“primary priority content that is harmful to children”

or

“priority content that is harmful to children”

is covered in clause 53(4)(a) and (b), so we are now left with the residue of content that is neither illegal nor primary priority or priority content; it is anything left over that might be harmful. By definition, we have excluded all the serious harms already, because they would be either illegal or in the priority categories. We are left with the other stuff. The reason for the qualifier “appreciable” is to make sure that we are dealing only with the residual non-priority harmful matters, and that the duty is reasonable. What constitutes “appreciable” will ultimately be set out through Ofcom guidance, but if content affected only a tiny handful of users and was not a priority harm—and was therefore not considered by Parliament to be of the utmost priority—the duty would be unlikely to apply to such a very small number. Because this is just the residual category, that is a proportionate and reasonable approach to take.

--- Later in debate ---
Kirsty Blackman

Given the Government’s ability to designate priority content and primary priority content through secondary legislation, is the Minister telling me that if the Government decided that loot boxes were not adequately covered by the future legislation coming through, and discovered that something like this was a big issue, they could add loot boxes to one of the two priority content designations?

Chris Philp

The hon. Member is asking me a somewhat technical question, and I hesitate to answer without taking full advice, but I think the answer is yes. The reason that loot boxes are not considered gambling in our view is that they do not have a monetary value, so the exclusion in clause 53(5)(b)(i) does not apply. On a quick off-the-cuff reading, it does not strike me immediately that the exclusions in (5)(b)(ii) or (iii) would apply to loot boxes either, so I believe—and officials who know more about this than I do are nodding—that the hon. Lady is right to say that it would be possible for loot boxes to become primary priority content or priority content by way of a statutory instrument. Yes, my belief is that that would be possible.

Question put and agreed to.

Clause 53 accordingly ordered to stand part of the Bill.

Clause 54

“Content that is harmful to children” etc

--- Later in debate ---
Alex Davies-Jones

I completely agree with my hon. Friend. As parliamentarians, we are seen as experts in an array of fields. I do not purport to be an expert in all things—mine is more a jack-of-all-trades role—and it would be impossible for one Secretary of State to be an expert in everything from animal abuse to online scam ads, from fraud to CSAM and terrorism. That is why it is fundamental that the Secretary of State consult experts and stakeholders in those fields, for whom these things are their bread and butter—their day job, every day. I hope the Minister can see that regulation of the online space is a huge task for us all to take on. It is Labour’s view that any Secretary of State would benefit from the input of experts in specific fields. I urge him to support the amendment, especially given the wider concerns we have about transparency and power sharing in the Bill.

It is welcome that clause 56 will force Ofcom, as the regulator, to carry out important reviews assessing the extent to which content appearing on user-to-user services is harmful to children and adults. As we have repeatedly said, transparency must be at the heart of our approach. While Labour does not formally oppose the clause, we have concerns about subsection (5), which states:

“The reports must be published not more than three years apart.”

The Minister knows that the Bill has been long awaited, and we need to see real, meaningful change and updates now. Will he tell us why it contains a three-year provision?

Kirsty Blackman

I thank the Minister for his clarification earlier and his explanation of how the categories of primary priority content and priority content can be updated. That was helpful.

Amendment 62 is excellent, and I am more than happy to support it.

Dame Maria Miller

I have a short comment on clause 56, which is an important clause because it will provide an analysis of how the legislation is working, and that is what Members want to see. To the point that the hon. Member for Pontypridd set out, it is right that Ofcom probably will not report until 2026, given the timeframe for the Bill being enacted. I would not necessarily want Ofcom to report sooner, because system changes take a long time to bed in. It does pose the question, however, of how Parliament will be able to analyse whether the legislation or its approach needs to change between now and 2026. That reiterates the need—which I and other hon. Members have pointed out—for some sort of standing committee to scrutinise these issues. I do not personally think it would be right to get Ofcom to report earlier, because it might produce an incomplete report.

--- Later in debate ---
Dame Maria Miller

I rise to welcome clause 57. It is an important part of the Bill and shows the Government acknowledging that anonymity can have a significant impact on the harms that affect victims. There is a catalogue of evidence of the harm done by those posting anonymously. Anonymity appears to encourage abusive behaviour, and there is evidence dating back to 2015 showing that anonymous accounts are more likely to share sexist comments and that online harassment victims are often not able to identify their perpetrators because of the way anonymity works online. The Government are doing an important thing here and I applaud them.

I underline that again by saying that recent research from Compassion in Politics showed that more than one in four people were put off posting on social media because of the fear of abuse, particularly from anonymous posters. Far from the status quo promoting freedom of speech, it actually deters freedom of speech, as we have said in other debates, and it particularly affects women. The Government are to be applauded for this measure.

In the work I was doing with the FA and the Premier League on this very issue, I particularly supported their call for a twin-track approach to verified accounts, under which verification would be the default and people would automatically be able to opt out of receiving posts from unverified accounts. The Bill does not go as far as that, and I can understand the Government’s reasons, but I gently point out that 81% of the people who took part in the Compassion in Politics research would willingly provide identification to get a verified account if it reduced unverified posts. They felt that was important. Some 72% supported the idea if it reduced the amount of anonymous posting.

I am touching on clause 58, but I will not repeat myself when we debate that clause. I hope that it will be possible in the code of practice for Ofcom to point out the clear benefits of having verified accounts by default, and perhaps to urge providers to do the responsible thing and allow their users to automatically filter out unverified accounts. That is what users want, and it is extraordinary that large consumer organisations do not seem to want to give consumers what they want. Perhaps Ofcom can help those organisations understand what their consumers want, certainly in Britain.

Kirsty Blackman

The right hon. Lady’s speech inspired me to stand up and mention a couple of things. My first question is about user empowerment under this clause. The clause applies only to adults. I can understand the issues that there may be with verifying the identity of children, but if that means that children are unable to block unverified accounts because they cannot verify their own accounts, the internet becomes a less safe place for children than for adults in this context, which concerns me.

To be honest, I do not know how children’s identities could be verified, but giving them access to the filters that would allow them to block unverified accounts, whether or not they are able to verify themselves—because they are children and therefore may not have the identity documentation they need—would be very helpful.

I appreciate the points that the right hon. Member was making, and I completely agree with her on the requirement for user verification, but I have to say that I believe there is a place for anonymity on the internet. I can understand why, for a number of people, that is the only way that they can safely access some of the community support that they need.

Dame Maria Miller

Just for clarity, the twin-track approach does not outlaw anonymity. It just means that people have verified accounts by default; they do not have to opt into it.

Kirsty Blackman

I appreciate that clarification. I just wanted to make it absolutely clear that I strongly believe that anonymity is a very good protection—not merely a cover for people who intend to do harm on the internet, but a protection for people who are seeking out community in particular. I think that that is important.

If you will allow me to say a couple of things about the next clause, Sir Roger: Mencap raised the issue of vulnerable users, specifically vulnerable adult users, in relation to the form of identity verification. If the Minister or Ofcom could consider including travel passes or adult passes, it might make the internet a much easier place to navigate for people who do not have control of their own documentation—they may not have access to their passport, birth certificate or anything of that sort—but who would be able to provide a travel pass, because that is within their ownership.

Chris Philp

We have heard quite a lot about the merits of clause 57, and I am grateful to colleagues on both sides for pointing those out. The hon. Member for Pontypridd asked about the effectiveness of the user identity verification processes and how they might work—whether verification would be done individually by each company for its own users, or whether an industry would develop even further, with third parties providing verification that could then be used across a whole range of companies.

Some of those processes exist already in relation to age verification, and I think that some companies are already active in this area. I do not think that it would be appropriate for us, in Parliament, to specify those sorts of details. It is ultimately for Ofcom to issue that guidance under clause 58, and it is, in a sense, up to the market and to users to develop their own preferences. If individual users prefer to verify their identity once and then have that used across multiple platforms, that will itself drive the market. I think that there is every possibility that that will happen. [Interruption.]

--- Later in debate ---
Barbara Keeley

The additional regulations created by the Secretary of State in connection with the reports will have a lot resting on them. It is vital that they receive the appropriate scrutiny when the time comes. For example, the regulations must ensure that referrals to the National Crime Agency made by companies are of a high quality, and that requirements are easy to comply with. Prioritising the highest risk cases will be important, particularly where there is an immediate threat to the safety and welfare of a child.

Clause 60 sets out that the Secretary of State’s regulations must include

“provision about cases of particular urgency”.

Does the Minister have an idea what that will look like? What plans are in place to ensure that law enforcement can prioritise the highest risk and harm cases?

Under the new arrangements, the National Crime Agency as the designated body, the Internet Watch Foundation as the appropriate authority for notice and takedown in the UK, and Ofcom as the regulator for online harms will all hold a vast amount of information on the scale of the threat posed by child sexual exploitation and illegal content. How will the introduction of mandatory reporting assist those three organisations in improving their understanding of how harm manifests online? How does the Minister envisage the organisations working together to share information to better protect children online?

Kirsty Blackman

I am glad that clause 60 will be in the Bill and that there will be a duty to report to the NCA. On subsection (3), though, I would like the Minister to clarify that if the Secretary of State believes that the Scottish Ministers would be appropriate people to consult, they would consult them, and the same for the Northern Ireland Executive.

I would appreciate the Minister explaining how clause 61 will work in a Scottish context, because that clause refers to the Crime and Courts Act 2013. Does a discussion need to be had with Scottish Ministers, and perhaps Northern Ireland Ministers as well, to ensure that information sharing with devolved areas that have their own legal systems takes place as seamlessly as it does within England and Wales? If the Minister does not have an answer today, which I understand he may not in detail, I am happy to hear from him later; I understand that it is quite a technical question.

--- Later in debate ---
Chris Philp

Clause 62 creates an offence, as we discussed earlier, of knowingly or recklessly providing inaccurate information to the NCA in relation to CSEA reporting, the penalty for which is imprisonment, a fine or both. Where a company seeks to evade its responsibility, or disregards the importance of the requirement to report CSEA by providing inaccurate information, it will be liable for prosecution. We are backing the requirement to report CSEA with significant criminal powers.

Clause 63 provides definitions for the terms used in chapter 2 of part 4, in relation to the requirement to report CSEA. In summary, a UK provider of a regulated service is defined as a provider that is

“incorporated or formed under the law of any part of the United Kingdom”

or where the provider consists of

“individuals who are habitually resident in the United Kingdom”.

The shadow Minister asked about the test and what counts, and I hope that provides the answer. We are defining CSEA content as content containing CSEA that a company becomes aware of. A company can become aware of such content by any means, including through the use of automated systems and processes, human moderation or user reporting.

With regard to the definition of UK-linked CSEA, which the shadow Minister also asked about, that refers to content that may have been published and shared in the UK, or where the nationality or location of a suspected offender or victim is in the UK. The definition of what counts as a UK link is quite wide, because it includes not only the location of the offender or victim but where the content is shared.

Kirsty Blackman

I have a specific question—the Minister answered a similar question from me earlier. The Bill says that the location of the child “is” in the UK. Would it be reasonable to expect that if a company suspected the child “was” in the UK, although not currently, that would be in scope as something required to be reported? I know that is technical, but if the “was” is included in the “is” then that is much wider and more helpful than just including the current location.

Chris Philp

If the child had been in the UK when the offence was committed, that would ordinarily be subject to UK criminal law, because the crime would have been committed in the UK. The test is: where was the child or victim at the time the offence was committed? As I said a moment ago, however, the definition of “UK-linked” is particularly wide and includes

“the place where the content was published, generated, uploaded or shared.”

The word “generated”—I am reading from clause 63(6)(a), at the top of page 56—is clearly in the past tense and would include the circumstance that the hon. Lady described.

--- Later in debate ---
Schedule 7 is an important schedule, which outlines the providers of internet services that are not subject to the duties on regulated provider pornographic content. Those are important exemptions that Labour welcomes being clarified in the Bill. For that reason, we have tabled no amendments at present.

Kirsty Blackman

I associate myself with the comments made by the hon. Member for Pontypridd and apologise on behalf of my hon. Friend the Member for Ochil and South Perthshire, who is currently in the Chamber dealing with the Channel 4 privatisation. I am sure that, given his position on the Joint Committee, he would have liked to comment on the clause and would have welcomed its inclusion in the Bill, but, unfortunately, he cannot currently do so.

Chris Philp

It is a great shame that the hon. Member for Ochil and South Perthshire is occupied in the main Chamber, because I could have pointed to this change as one of the examples of the Government listening to the Joint Committee, on which he and many others served. However, I hope that the hon. Member for Aberdeen North will communicate my observation to him, which I am sure he will appreciate.

In all seriousness, this is an example of the Government moving the Bill on in response to widespread parliamentary and public commentary. It is right that we extend the duties to cover commercial pornographic content as well as the user-to-user pornography covered previously. I thank the Opposition parties for their support for the inclusion of those measures.

--- Later in debate ---
Alex Davies-Jones

Clause 68 outlines the duties covering regulated provider pornographic content, and Ofcom’s guidance on those duties. Put simply, the amendments are about age verification and consent, to protect women and children who are victims of commercial sexual exploitation.

I am moving a series of targeted amendments, tabled by my right hon. Friend the Member for Kingston upon Hull North (Dame Diana Johnson), which I hope that all hon. Members will be able to support because this is an issue that goes beyond party lines. This is about children who have been sexually abused, women who have been raped, and trafficking victims who have been exploited, who have all suffered the horror of filmed footage of their abuse being published on some of the world’s biggest pornography websites. This is about basic humanity.

Currently, leading pornography websites allow members of the public to upload pornographic videos without verifying that everyone in the film is an adult, that they gave their permission for it to be uploaded to a pornography website, or even that they know the film exists. It is sadly not surprising that because of the absence of even the most basic safety measures, hugely popular and profitable pornography websites have been found hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse. This atrocious practice is ongoing and well documented.

In 2019, PayPal stopped processing payments for Pornhub—one of the most popular pornography websites in the world—after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. That included an account on the site dedicated to posting so-called creepshots of UK schoolgirls. In 2020, The New York Times documented the presence of child abuse videos on Pornhub, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site.

New York Times reporter Nicholas Kristof wrote of Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

That particular pornography website is now subject to multiple lawsuits launched against its parent company, MindGeek, by victims whose abuse was published on the site. Plaintiffs include victims of image-based sexual abuse in the UK, such as Crystal Palace footballer Leigh Nicol. Her phone was hacked, and private content was uploaded to Pornhub without her knowledge. She bravely and generously shared her experience in an interview for Sky Sports News, saying:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do… The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

I agree. It is grotesque that pornography website operators do not even bother to verify that everyone featured in films on their sites is an adult or even gave permission for the film to be uploaded. That cannot be allowed to continue.

These amendments, which I hope will receive the cross-party backing that they strongly deserve, would stop pornography websites publishing and profiting from videos of rape and child sexual abuse by requiring them to implement the most basic of prevention measures.

Kirsty Blackman

I support the hon. Member’s amendments. The cases that she mentions hammer home the need for women and girls to be mentioned in the Bill. I do not understand how the Government can justify not doing so when she is absolutely laying out the case for doing so.

Alex Davies-Jones

I agree with the hon. Member and welcome her intervention. We will be discussing these issues time and again during our proceedings. What is becoming even more apparent is the need to include women and girls in the Bill, call out violence against women and girls online for what it is, and demand that the Government go further to protect women and girls. This is yet another example of where action needs to happen. I hope the Minister is hearing our pleas and that this will happen at some point as we make progress through the Bill.

More needs to be done to tackle this problem. Pornography websites need to verify that every individual featured in pornographic videos published on their site is an adult and gave their permission for the video to be published, and they need to enable individuals to withdraw their consent for pornography of them to remain on the site. These are rock-bottom safety measures for preventing the most appalling abuses on pornography websites.