Online Safety Bill Debate
Commons Chamber
I will answer the question about Standing Order No. 24 first, because I can deal with it immediately: clearly, if an application is made, Mr Speaker will determine it himself.
The principles concerning motions of no confidence are set out at paragraph 18.44 of “Erskine May”, which also gives examples of motions that have been debated and those that have not. “May” says:
“By established convention, the Government always accedes to the demand from the Leader of the Opposition to allot a day for the discussion of a motion tabled by the official Opposition which, in the Government’s view, would have the effect of testing the confidence of the House.”
I can only conclude, therefore, that the Government have concluded that the motion, as tabled by the official Opposition, does not have that effect. That is a matter for the Government, though, rather than for the Chair.
May I say that there are seven more sitting days before recess? As Deputy Speaker, I would anticipate that there will be further discussions.
We now have to move on with the continuation of business on the Bill.
New Clause 7
Duties regarding user-generated pornographic content: regulated services
“(1) This section sets out the duties which apply to regulated services in relation to user-generated pornographic content.
(2) A duty to verify that each individual featuring in the pornographic content has given their permission for the content in which they feature to be published or made available by the service.
(3) A duty to remove pornographic content featuring a particular individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.
(4) For the meaning of ‘pornographic content’, see section 66(2).
(5) In this section, ‘user-generated pornographic content’ means any content falling within the meaning given by subsection (4) and which is also generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, and may be encountered by another user, or other users, of the service.
(6) For the meaning of ‘regulated service’, see section 2(4).”—(Dame Diana Johnson.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 33—Meaning of “pornographic content”—
“(1) In this Act ‘pornographic content’ means any of the following—
(a) a video work in respect of which the video works authority has issued an R18 certificate;
(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;
(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;
(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;
(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—
(i) that it was produced solely or principally for the purposes of sexual arousal, and
(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;
(f) any other content if it is reasonable to assume from its nature—
(i) that it was produced solely or principally for the purposes of sexual arousal, and
(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;
(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—
(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and
(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;
(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—
(i) that it was produced solely or principally for the purposes of sexual arousal, and
(ii) that its inclusion was among the reasons why the video works authority made that determination;
(i) any other content if it is reasonable to assume from the nature of the content—
(i) that it was produced solely or principally for the purposes of sexual arousal, and
(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.
(2) In this section—
‘18 certificate’ means a classification certificate which—
(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and
(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;
‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);
‘content’ means—
(a) a series of visual images shown as a moving picture, with or without sound;
(b) a still image or series of still images, with or without sound; or
(c) sound;
‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;
‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;
‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”
This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.
Amendment 205, in clause 34, page 33, line 23, at end insert—
“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”
Amendment 206, page 33, line 30, after “has” insert
“or may reasonably be expected to have”.
Amendment 207, in clause 36, page 35, line 12, at end insert—
“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”
Amendment 208, page 35, line 18, after “(3)” insert “, (3A)”.
Amendment 209, page 35, line 20, after “(3)” insert “, (3A)”.
Amendment 210, page 35, line 23, after “(3)” insert “, (3A)”.
Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert
“has the same meaning as section [meaning of pornographic content]”.
This amendment defines pornographic content for the purposes of Part 5. It is consequential on NC33.
Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”
This amendment would require that content is considered as a whole before being defined as pornographic content.
Amendment 33, in clause 68, page 60, line 33, at end insert—
“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.
(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.
(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.
(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”
This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.
Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.
This amendment is consequential on Amendment 33.
Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—
“(a) a draft of the instrument has been laid before each House of Parliament,
(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and
(c) the draft instrument has been approved by a resolution of each House of Parliament.”
This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).
Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.
This amendment clarifies that the list of types of content in clause 192 is not exhaustive.
May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?
New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their sites, and would enable individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:
“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”
She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.
This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.
In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:
“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”
Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:
“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.
The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”
One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:
“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”
Leigh Nicol is spot on.
Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.
The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:
“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.
Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.
New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.
I echo the concerns expressed by the right hon. Member for Kingston upon Hull North (Dame Diana Johnson). Some appalling abuses are taking place online, and I hope that the Bill goes some way to address them, to the extent that that is possible within the framework that it sets up. I greatly appreciate the right hon. Lady’s comments and her contribution to the debate.
I have a tight and narrow point for the Minister. In amendment 56, I seek to ensure that only pornographic material is caught by the definition in the Bill. My concern is that we catch these abuses online, catch them quickly and penalise them harshly, but also that sites that may display, for example, works of art featuring nudes—or body positivity community sites, of which there are several—are not inadvertently caught in our desire to clamp down on illegal pornographic sites. Perhaps the Minister will say a few words about that in his closing remarks.
I was not planning to speak, but we have a couple of minutes so I will abuse that position.
I just want to say that I do not want new clause 7 to be lost in this debate and become part of the flotsam and jetsam of the tide of opinion that goes back and forth in this place, because new clause 7 is about consent. We are trying very hard to teach young men all about consent, and if we cannot do it from this place, then when can we do it? We can work out the details of the technology in time, as we always do. It is out there. Other people are way ahead of us in this matter. In fact, the people who produce this pornography are way ahead of us in this matter.
While we have been having this debate, Iain Corby, executive director at the Age Verification Providers Association, has sent me an email in which he said that the House may be interested to know that one of the members of that organisation offers adult sites a service that facilitates age verification and the obtaining and maintaining of records of consent. So it is possible to do this if the will is there.
I absolutely agree. We can also look at this from the point of view of gambling reform and age verification for that. The technology is there, and we can harness and use it to protect people. All I am asking is that we do not let this slip through the cracks this evening.
It is because of the danger of such a sentiment that this Bill is so important. It not only sets the targets and requirements for companies to act against illegal content, but enables a regulator to ensure that they have the systems and processes in place to do so, that they are using appropriate technology and that they apply the principle that their systems should be effective at addressing this issue. If those systems are deficient, that is a failure on the company’s part. It cannot be good enough for a company to say, “It is too difficult to do”, when it is not using technologies that would readily solve the problem. We believe that the technologies available to the companies, and the powers of the regulator to put proper codes of practice in place and to order companies to follow them, will be sufficient to address the concern that the hon. Lady raises.
I am a little taken aback that the Minister believes that the legislation will be sufficient. I do not understand why he has not responded to the point that my hon. Friend the Member for Birmingham, Yardley (Jess Phillips) was making that we could make this happen by putting the proposal in the Bill and saying, “This is a requirement.” I am not sure why he thinks that is not the best way forward.
It is because the proposal would not make such content more illegal than it is now. It is already illegal and there are already legal duties on companies to act. The regulator’s job is to ensure they have the systems in place to do that effectively, and that is what the Bill sets out. We believe that the Bill addresses the serious issue that the right hon. Lady raises in her amendments. That legal requirement is there, as is the ability to have the systems in place.
If I may, I will give a different example, based on the fraud example given by the shadow Minister, the hon. Member for Worsley and Eccles South (Barbara Keeley). On the Joint Committee that scrutinised the Bill, we pushed hard to have fraudulent ads included within the scope of the Bill, which has been one of the important amendments to it. The regulator can consider not only what systems a company should have in place to identify fraud, but what technologies it employs to make it far less likely that fraud would be there in the first place. Google has a deal with the Financial Conduct Authority whereby it restricts companies that are not accredited by the FCA from advertising on its platform. That makes it far less likely that fraudulent advertising will appear because, if the system works, only properly recognised organisations will be advertising.
Facebook does not have such a system in place. As a consequence, since the Google system went live, we have seen a dramatic drop in fraud ads on Google, but a substantial increase in fraud ads on Facebook and platforms such as Instagram. That shows that if we have the right systems in place, we can have a better outcome and change the result. The job of the regulator with illegal pornography and other illegal content should be to look at those systems and say, “Do the companies have the right technology to deliver the result that is required?” If they do not, that would still be a failure of the codes.