Mr Speaker

With this it will be convenient to discuss the following:

New clause 2—Secretary of State’s powers to suggest modifications to a code of practice

“(1) The Secretary of State may on receipt of a code write within one month of that day to OFCOM with reasoned, evidence-based suggestions for modifying the code.

(2) OFCOM shall have due regard to the Secretary of State’s letter and must reply to the Secretary of State within one month of receipt.

(3) The Secretary of State may only write to OFCOM twice under this section for each code.

(4) The Secretary of State and OFCOM shall publish their letters as soon as reasonably possible after transmission, having made any reasonable redactions for public safety and national security.

(5) If the draft of a code of practice contains modifications made following changes arising from correspondence under this section, the affirmative procedure applies.”

New clause 3—Priority illegal content: violence against women and girls

“(1) For the purposes of this Act, any provision applied to priority illegal content should also be applied to any content which—

(a) constitutes,

(b) encourages, or

(c) promotes violence against women and girls.

(2) ‘Violence against women and girls’ is defined by Article 3 of the Council of Europe Convention on Preventing Violence Against Women and Domestic Violence (‘the Istanbul Convention’).”

This new clause applies the provisions relating to priority illegal content to content which constitutes, encourages or promotes violence against women and girls.

New clause 4—Duty about content advertising or facilitating prostitution: Category 1 and Category 2B services

“(1) A provider of a Category 1 or Category 2B service must operate the service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of a Category 1 or Category 2B service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one Category 1 or Category 2B service, the duties set out in this section apply in relation to each such service.

(4) The duties set out in this section extend only to the design, operation and use of a Category 1 or Category 2B service in the United Kingdom.

(5) For the meaning of ‘Category 1 service’ and ‘Category 2B service’, see section 81 (register of categories of services).

(6) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 5—Duty about content advertising or facilitating prostitution: Category 2A services

“(1) A provider of a Category 2A service must operate that service so as to minimise the risk of individuals encountering content which advertises or facilitates prostitution in or via search results of the service.

(2) A provider of a Category 2A service must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The reference to encountering content which advertises or facilitates prostitution “in or via search results” of a search service does not include a reference to encountering such content as a result of any subsequent interactions with an internet service other than the search service.

(4) If a person is the provider of more than one Category 2A service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a Category 2A service in the United Kingdom.

(6) For the meaning of ‘Category 2A service’, see section 81 (register of categories of services).

(7) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 6—Duty about content advertising or facilitating prostitution: internet services providing pornographic content

“(1) A provider of an internet service within the scope of section 67 of this Act must operate that service so as to—

(a) prevent individuals from encountering content that advertises or facilitates prostitution;

(b) minimise the length of time for which any such content is present;

(c) where the provider is alerted by a person to the presence of such content, or becomes aware of it in any other way, swiftly take down such content.

(2) A provider of an internet service under this section must include clear and accessible provisions in a publicly available statement giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) If a person is the provider of more than one internet service under this section, the duties set out in this section apply in relation to each such service.

(4) For the meaning of ‘prostitution’, see section 54 of the Sexual Offences Act 2003.”

New clause 8—Duties about advertisements for cosmetic procedures

“(1) A provider of a regulated service must operate the service using systems and processes designed to—

(a) prevent individuals from encountering advertisements for cosmetic procedures that do not meet the conditions specified in subsection (3);

(b) minimise the length of time for which any such advertisement is present;

(c) where the provider is alerted by a person to the presence of such an advertisement, or becomes aware of it in any other way, swiftly take it down.

(2) A provider of a regulated service must include clear and accessible provisions in the terms of service giving information about any proactive technology used by the service for the purpose of compliance with the duty set out in subsection (1) (including the kind of technology, when it is used, and how it works).

(3) The conditions under subsection (1)(a) are that the advertisement—

(a) contains a disclaimer as to the health risks of the cosmetic procedure, and

(b) includes a certified service quality indicator.

(4) If a person is the provider of more than one regulated service, the duties set out in this section apply in relation to each such service.

(5) The duties set out in this section extend only to the design, operation and use of a regulated service in the United Kingdom.

(6) For the meaning of ‘regulated service’, see section 3 (‘Regulated service’, ‘Part 3 service’ etc).”

This new clause would place a duty on all internet service providers regulated by the Bill to prevent individuals from encountering adverts for cosmetic procedures that do not contain a disclaimer as to the health risks of the procedure or include a certified service quality indicator.

New clause 9—Content harmful to adults risk assessment duties: regulated search services

“(1) This section sets out the duties about risk assessments which apply in relation to all regulated search services.

(2) A duty to carry out a suitable and sufficient priority adults risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adult risk assessment relating to the impacts of that proposed change.

(5) An ‘adults risk assessment’ of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the level of risk of individuals who are users of the service encountering each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) risks presented by algorithms used by the service, and the way that the service indexes, organises and presents search results;

(b) the level of risk of functionalities of the service facilitating individuals encountering search content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(c) the nature, and severity, of the harm that might be suffered by individuals from the matters identified in accordance with paragraphs (a) and (b);

(d) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section, references to risk profiles are to the risk profiles for the time being published under section 84 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) See also—section 20(2) (records of risk assessments), and Schedule 3 (timing of providers’ assessments).”

New clause 10—Safety Duties Protecting Adults: regulated search services

“(1) This section sets out the duties about protecting adults which apply in relation to all regulated search services.

(2) A duty to summarise in the policies of the search service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the search service policies specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) The duties set out in subsections (2) and (3) apply across all areas of a service, including the way the search engine is operated and used as well as search content of the service, and (among other things) require the provider of a service to take or use measures in the following areas, if it is proportionate to do so—

(a) regulatory compliance and risk management arrangements,

(b) design of functionalities, algorithms and other features relating to the search engine,

(c) functionalities allowing users to control the content they encounter in search results,

(d) content prioritisation and ranking,

(e) user support measures, and

(f) staff policies and practices.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the policies included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the policies in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) A duty to ensure that the provisions of the publicly available statement referred to in subsections (5) and (7) are clear and accessible.

(9) In this section—

‘adults’ risk assessment’ has the meaning given by section 12;

‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”

New clause 18—Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section ‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.”

New clause 24—Category 1 services: duty not to discriminate, harass or victimise against service users

“(1) The following duties apply to all providers of Category 1 services.

(2) A duty not to discriminate, on the grounds of a protected characteristic, against a person wishing to use the service by not providing the service, if the result of not providing the service is to cause harm to that person.

(3) A duty not to discriminate, on the grounds of a protected characteristic, against any user of the service in a way that causes harm to the user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(4) A duty not to harass, on the grounds of a protected characteristic, a user of the service in a way that causes harm to the user.

(5) A duty not to victimise because of a protected characteristic a person wishing to use the service by not providing the user with the service, if the result of not providing the service is to cause harm to that person.

(6) A duty not to victimise a service user—

(a) as to the terms on which the provider provides the service to the user;

(b) by terminating the provision of the service to the user;

(c) by subjecting the user to any other harm.

(7) In this section—

references to harassing, discriminating or victimising have the same meaning as set out in Part 2 of the Equality Act 2010;

‘protected characteristic’ means a characteristic listed in section 4 of the Equality Act 2010.”

This new clause would place a duty, regulated by Ofcom, on Category 1 service providers not to discriminate, harass or victimise users of their services on the basis of a protected characteristic if doing so would result in them being caused harm. Discrimination, harassment and victimisation, and protected characteristics, have the same meaning as in the Equality Act 2010.

New clause 25—Report on duties that apply to all internet services likely to be accessed by children

“(1) Within 12 months of this Act receiving Royal Assent, the Secretary of State must commission an independent evaluation of the matters under subsection (2) and must lay the report of the evaluation before Parliament.

(2) The evaluation under subsection (1) must consider whether the following duties should be imposed on all providers of services on the internet that are likely to be accessed by children, other than services regulated by this Act—

(a) duties similar to those imposed on regulated services by sections 10 and 25 of this Act to carry out a children’s risk assessment, and

(b) duties similar to those imposed on regulated services by sections 11 and 26 of this Act to protect children’s online safety.”

This new clause would require the Secretary of State to commission an independent evaluation of whether all providers of internet services likely to be accessed by children should be subject to child safety duties and required to conduct a children’s risk assessment.

New clause 26—Safety by design

“(1) In exercising their functions under this Act—

(a) The Secretary of State, and

(b) OFCOM

must have due regard to the principles in subsections (2)-(3).

(2) The first principle is that providers of regulated services should design those services to prevent harmful content from being disseminated widely, and that this is preferable in the first instance to both—

(a) removing harmful content after it has already been disseminated widely, and

(b) restricting which users can access the service or part of it on the basis that harmful content is likely to disseminate widely on that service.

(3) The second principle is that providers of regulated services should safeguard freedom of expression and participation, including the freedom of expression and participation of children.”

This new clause requires the Secretary of State and Ofcom to have due regard to the principle that internet services should be safe by design.

New clause 27—Publication of risk assessments

“Whenever a Category 1 service carries out any risk assessment pursuant to Part 3 of this Act, the service must publish the risk assessment on the service’s website.”

New clause 38—End-to-end encryption

“Nothing in this Act shall prevent providers of user-to-user services protecting their users’ privacy through end-to-end encryption.”

Government amendment 57.

Amendment 202, in clause 6, page 5, line 11, at end insert—

“(ba) the duty about pornographic content set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that user-to-user services must meet the new duties set out in NS1.

Government amendments 163, 58, 59 and 60.

Amendment 17, in clause 8, page 7, line 14, at end insert—

“(h) how the service may be used in conjunction with other regulated user-to-user services such that it may—

(i) enable users to encounter illegal content on other regulated user-to-user services, and

(ii) constitute part of a pathway to harm to individuals who are users of the service, in particular in relation to CSEA content.”

This amendment would incorporate into the duties a requirement to consider cross-platform risk.

Amendment 15, in clause 8, page 7, line 14, at end insert—

“(5A) The duties set out in this section apply in respect of content which reasonably foreseeably facilitates or aids the discovery or dissemination of CSEA content.”

This amendment extends the illegal content risk assessment duties to cover content which could be foreseen to facilitate or aid the discovery or dissemination of CSEA content.

Government amendments 61 and 62.

Amendment 18, page 7, line 30 [Clause 9], at end insert—

“(none) ‘, including by being directed while on the service towards priority illegal content hosted by a different service;’

This amendment aims to include within companies’ safety duties a duty to consider cross-platform risk.

Amendment 16, in clause 9, page 7, line 35, at end insert—

“(d) minimise the presence of content which reasonably foreseeably facilitates or aids the discovery or dissemination of priority illegal content, including CSEA content.”

This amendment brings measures to minimise content that may facilitate or aid the discovery of priority illegal content within the scope of the duty to maintain proportionate systems and processes.

Amendment 19, in clause 9, page 7, line 35, at end insert—

“(3A) A duty to collaborate with other companies to take reasonable and proportionate measures to prevent the means by which their services can be used in conjunction with other services to facilitate the encountering or dissemination of priority illegal content, including CSEA content.”

This amendment creates a duty to collaborate in cases where there is potential cross-platform risk in relation to priority illegal content and CSEA content.

Government amendments 63 to 67.

Amendment 190, page 10, line 11, in clause 11, at end insert “, and—

(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”

This amendment requires services to take or use proportionate measures to mitigate the harm to children caused by habit-forming features of a service.

Government amendments 68 and 69.

Amendment 42, page 11, line 16, in clause 11, at end insert—

“(c) the benefits of the service to children’s well-being.”

Amendment 151, page 12, line 43, leave out Clause 13.

This amendment seeks to remove Clause 13 from the Bill.

Government amendment 70.

Amendment 48, page 13, line 5, in clause 13, leave out “is to be treated” and insert

“the provider decides to treat”

This amendment would mean that providers would be free to decide how to treat content that has been designated ‘legal but harmful’ to adults.

Amendment 49, page 13, line 11, in clause 13, at end insert—

‘(ca) taking no action;”

This amendment provides that providers would be free to take no action in response to content referred to in subsection (3).

Government amendments 71 and 72.

Amendment 157, page 14, line 11, in clause 14, leave out subsections (6) and (7).

This amendment is consequential to Amendment 156, which would require all users of Category 1 services to be verified.

Government amendments 73, 164, 74 and 165.

Amendment 10, page 16, line 16, in clause 16, leave out from “or” until the end of line 17.

Government amendments 166 and 167.

Amendment 50, page 20, line 21, in clause 19, at end insert—

“(6A) A duty to include clear provision in the terms of service that the provider will not take down, or restrict access to, content generated, uploaded or shared by a user save where it reasonably concludes that—

(a) the provider is required to do so pursuant to the provisions of this Act, or

(b) it is otherwise reasonable and proportionate to do so.”

This amendment sets out a duty for providers to include in terms of service a commitment not to take down or restrict access to content generated, uploaded or shared by a user except in particular circumstances.

Government amendment 168.

Amendment 51, page 20, line 37, in clause 19, at end insert—

“(10) In any claim for breach of contract brought in relation to the provisions referred to in subsection (7), where the breach is established, the court may make such award by way of compensation as it considers appropriate for the removal of, or restriction of access to, the content in question.”

This amendment means that where a claim is made for a breach of the terms of service resulting from Amendment 50, the court has the power to award such compensation as it considers appropriate.

Government amendment 169.

Amendment 47, page 22, line 10, in clause 21, at end insert—

“(ba) the duties about adults’ risk assessments in section (Content harmful to adults risk assessment duties: regulated search services),

(bb) the safety duties protecting adults in section (Safety duties protecting adults: regulated search services).”

Government amendments 75 to 82.

Amendment 162, page 31, line 19, in clause 31, leave out “significant”

This amendment removes the requirement for there to be a “significant” number of child users, and replaces it with “a number” of child users.

Government amendments 85 to 87.

Amendment 192, page 36, line 31, in clause 37, at end insert—

“(ha) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 about codes of practice.

Amendment 44, page 37, line 25, in clause 39, leave out from beginning to the second “the” in line 26.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 45, page 38, line 8, leave out Clause 40.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 13, page 38, line 12, in clause 40, leave out paragraph (a).

Amendment 46, page 39, line 30, leave out Clause 41.

This amendment will remove the ability of the Secretary of State to block codes of practice being, as soon as practical, laid before the House for its consideration.

Amendment 14, page 39, line 33, in clause 41, leave out subsection (2).

Amendment 21, page 40, line 29, in clause 43, leave out “may require” and insert “may make representations to”

Amendment 22, page 40, line 33, in clause 43, at end insert—

‘(2A) OFCOM must have due regard to representations by the Secretary of State under subsection (2).”

Government amendments 88 to 89 and 170 to 172.

Amendment 161, page 45, line 23, in clause 49, leave out paragraph (d).

This amendment removes the exemption for one-to-one live aural communications.

Amendment 188, page 45, line 24, in clause 49, leave out paragraph (e).

This amendment removes the exemption for comments and reviews on provider content.

Government amendments 90 and 173.

Amendment 197, page 47, line 12, in clause 50, after “material” insert

“or special interest news material”.

Amendment 11, page 47, line 19, in clause 50, after “has” insert “suitable and sufficient”.

Amendment 198, page 47, line 37, in clause 50, leave out the first “is” and insert

“and special interest news material are”.

Amendment 199, page 48, line 3, in clause 50, at end insert—

““special interest news material” means material consisting of news or information about a particular pastime, hobby, trade, business, industry or profession.”

Amendment 12, page 48, line 7, in clause 50, after “a” insert “suitable and sufficient”.

Government amendments 91 to 94.

Amendment 52, page 49, line 13, in clause 52, leave out paragraph (d).

This amendment limits the list of relevant offences to those specifically specified.

Government amendments 95 to 100.

Amendment 20, page 51, line 3, in clause 54, at end insert—

‘(2A) Priority content designated under subsection (2) must include—

(a) content that contains public health related misinformation or disinformation, and

(b) misinformation or disinformation that is promulgated by a foreign state.”

This amendment would require the Secretary of State’s designation of “priority content that is harmful to adults” to include public health-related misinformation or disinformation, and misinformation or disinformation spread by a foreign state.

Amendment 53, page 51, line 47, in clause 55, after “State” insert “reasonably”.

This amendment, together with Amendment 54, would mean that the Secretary of State must reasonably consider the risk of harm to each one of an appreciable number of adults before specifying a description of the content.

Amendment 54, page 52, line 1, in clause 55, after “to” insert “each of”.

This amendment is linked to Amendment 53.

Amendment 55, page 52, line 12, in clause 55, after “OFCOM” insert

“, Parliament and members of the public in a manner the Secretary of State considers appropriate”.

This amendment requires the Secretary of State to consult Parliament and the public, as well as Ofcom, in a manner the Secretary of State considers appropriate before making regulations about harmful content.

Government amendments 147 to 149.

Amendment 43, page 177, line 23, in schedule 4, after “ages” insert

“, including the benefits of the service to their well-being,”

Amendment 196, page 180, line 9, in schedule 4, at end insert—

Amendment 187, page 186, line 32, in schedule 7, at end insert—

“Human trafficking

22A An offence under section 2 of the Modern Slavery Act 2015.”

This amendment includes Human Trafficking as a priority offence.

Amendment 211, page 187, line 23, in schedule 7, at end insert—

Government new clause 14.

Government new clause 15.

Government amendments 83 to 84.

Amendment 156, page 53, line 7, in clause 57, leave out subsections (1) and (2) and insert—

‘(1) A provider of a Category 1 service must require all adult users of the service to verify their identity in order to access the service.

(2) The verification process—

(a) may be of any kind (and in particular, it need not require documentation to be provided),

(b) must—

(i) be carried out by a third party on behalf of the provider of the Category 1 service,

(ii) ensure that all anonymous users of the Category 1 service cannot be identified by other users, apart from where provided for by section (Duty to ensure anonymity of users).”

This amendment would require all users of Category 1 services to be verified. The verification process would have to be carried out by a third party and to ensure the anonymity of users.

Government amendment 101.

Amendment 193, page 58, line 33, in clause 65, at end insert—

“(ea) persons whom OFCOM consider to have expertise in matters relating to the Equality Act 2010,”

This amendment requires Ofcom to consult people with expertise on the Equality Act 2010 in respect of guidance about transparency reports.

Amendment 203, page 60, line 33, in clause 68, at end insert—

‘(2B) A duty to meet the conditions set out in Schedule [Additional duties on pornographic content].”

This amendment ensures that commercial pornographic websites must meet the new duties set out in NS1.

Government amendments 141, 177 to 184, 142 to 145, 185 to 186 and 146.

New schedule 1—Additional duties on pornographic content

“30 All user-to-user services and an internet service which provides regulated provider pornographic content must meet the following conditions for pornographic content and content that includes sexual photographs and films (“relevant content”).

The conditions are—

(a) the service must not contain any prohibited material,

(b) the service must review all relevant content before publication.

31 In this Schedule—

“photographs and films” has the same meaning as in section 34 of the Criminal Justice and Courts Act 2015 (meaning of “disclose” and “photograph or film”);

“prohibited material” has the same meaning as in section 368E(3) of the Communications Act 2003 (harmful material).”

The new schedule sets out additional duties for pornographic content which apply to user-to-user services under Part 3 and commercial pornographic websites under Part 5.

Government amendments 150 and 174.

Amendment 191, page 94, line 24, in clause 12, at end insert—

“Section [Category 1 services: duty not to discriminate against, harass or victimise service users] Duty not to discriminate against, harass or victimise

This amendment makes NC24 an enforceable requirement.

Government amendment 131.

Mr Speaker

I welcome the new Minister to the Dispatch Box.

Damian Collins

Thank you, Mr Speaker. I am honoured to have been appointed the Minister responsible for the Online Safety Bill. Having worked on these issues for a number of years, I am well aware of the urgency and importance of this legislation, in particular to protect children and tackle criminal activity online—that is why we are discussing this legislation.

Relative to the point of order from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), I have the greatest respect for him and his standing in this House, but it feels like we have been discussing this Bill for at least five years. We have had a Green Paper and a White Paper. We had a pre-legislative scrutiny process, which I was honoured to be asked to chair. We have had reports from the Digital, Culture, Media and Sport Committee and from other Select Committees and all-party parliamentary groups of this House. This legislation does not want for scrutiny.

We have also had a highly collaborative and iterative process in the discussion of the Bill. We have had 66 Government acceptances of recommendations made by the Joint Committee on the draft Online Safety Bill. We have had Government amendments in Committee. We are discussing Government amendments today and we have Government commitments to table amendments in the House of Lords. The Bill has received a huge amount of consultation. It is highly important legislation, and the victims of online crime, online fraud, bullying and harassment want to see us get the Bill into the Lords and on the statute book as quickly as possible.

Damian Collins

I completely agree with my right hon. and learned Friend. That is why the Bill passed Second Reading without a Division and the Joint Committee produced a unanimous report. I am happy for Members to cast me in the role of poacher turned gamekeeper on the Bill, but looking around the House, there are plenty of gamekeepers turned poachers here today who will ensure we have a lively debate.

Mr Speaker

And the other way, as well.

Damian Collins

Exactly. The concept at the heart of this legislation is simple. Tech companies, like those in every other sector, must take appropriate responsibility for the consequences of their business decisions. As they continue to offer their users the latest innovations that enrich our lives, they must consider safety as well as profit. They must treat their users fairly and ensure that the internet remains a place for robust debate. The Bill has benefited from input and scrutiny from right across the House. I pay tribute to my predecessor, my hon. Friend the Member for Croydon South (Chris Philp), who has worked tirelessly on the Bill, not least through 50 hours of Public Bill Committee, and the Bill is better for his input and work.

We have also listened to the work of other Members of the House, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the right hon. Member for Barking (Dame Margaret Hodge), my right hon. Friend the Member for Haltemprice and Howden and the Chair of the Select Committee, my hon. Friend the Member for Solihull (Julian Knight), who have all made important contributions to the discussion of the Bill.

We have also listened to those concerned about freedom of expression online. It is worth pausing on that, as there has been a lot of discussion about whether the Bill is censoring legal speech online and much understandable outrage from those who think it is. I asked the same questions when I chaired the Joint Committee on the Bill. This debate does not reflect the actual text of the Bill itself. The Bill does not require platforms to restrict legal speech—let us be absolutely clear about that. It does not give the Government, Ofcom or tech platforms the power to make something illegal online that is legal offline. In fact, if those concerned about the Bill studied it in detail, they would realise that the Bill protects freedom of speech. In particular, the Bill will temper the huge power over public discourse wielded by the big tech companies behind closed doors in California. They are unaccountable for the decisions they make on censoring free speech on a daily basis. Their decisions about what content is allowed will finally be subject to proper transparency requirements.

--- Later in debate ---
Dame Diana Johnson (Kingston upon Hull North) (Lab)

I beg to move, That the clause be read a Second time.

Mr Deputy Speaker

With this it will be convenient to discuss the following:

New clause 33—Meaning of “pornographic content”

“(1) In this Act ‘pornographic content’ means any of the following—

(a) a video work in respect of which the video works authority has issued an R18 certificate;

(b) content that was included in a video work to which paragraph (a) applies, if it is reasonable to assume from its nature that its inclusion was among the reasons why the certificate was an R18 certificate;

(c) any other content if it is reasonable to assume from its nature that any classification certificate issued in respect of a video work including it would be an R18 certificate;

(d) a video work in respect of which the video works authority has issued an 18 certificate, and that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal;

(e) content that was included in a video work to which paragraph (d) applies, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the certificate was an 18 certificate;

(f) any other content if it is reasonable to assume from its nature—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that any classification certificate issued in respect of a video work including it would be an 18 certificate;

(g) a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if—

(i) it includes content that it is reasonable to assume from its nature was produced solely or principally for the purposes of sexual arousal, and

(ii) it is reasonable to assume from the nature of that content that its inclusion was among the reasons why the video works authority made that determination;

(h) content that was included in a video work that the video works authority has determined not to be suitable for a classification certificate to be issued in respect of it, if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that its inclusion was among the reasons why the video works authority made that determination;

(i) any other content if it is reasonable to assume from the nature of the content—

(i) that it was produced solely or principally for the purposes of sexual arousal, and

(ii) that the video works authority would determine that a video work including it was not suitable for a classification certificate to be issued in respect of it.

(2) In this section—

‘18 certificate’ means a classification certificate which—

(a) contains, pursuant to section 7(2)(b) of the Video Recordings Act 1984, a statement that the video work is suitable for viewing only by persons who have attained the age of 18 and that no video recording containing that work is to be supplied to any person who has not attained that age, and

(b) does not contain the statement mentioned in section 7(2)(c) of that Act that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘classification certificate’ has the same meaning as in the Video Recordings Act 1984 (see section 7 of that Act);

‘content’ means—

(a) a series of visual images shown as a moving picture, with or without sound;

(b) a still image or series of still images, with or without sound; or

(c) sound;

‘R18 certificate’ means a classification certificate which contains the statement mentioned in section 7(2)(c) of the Video Recordings Act 1984 that no video recording containing the video work is to be supplied other than in a licensed sex shop;

‘the video works authority’ means the person or persons designated under section 4(1) of the Video Recordings Act 1984 as the authority responsible for making arrangements in respect of video works other than video games;

‘video work’ means a video work within the meaning of the Video Recordings Act 1984, other than a video game within the meaning of that Act.”

This new clause defines pornographic content for the purposes of the Act and would apply to user-to-user services and commercial pornographic content.

Amendment 205, in clause 34, page 33, line 23, at end insert—

“(3A) But an advertisement shall not be regarded as regulated user-generated content and precluded from being a ‘fraudulent advertisement’ by reason of the content constituting the advertisement being generated directly on, uploaded to, or shared on a user-to-user service before being modified to a paid-for advertisement.”

Amendment 206, page 33, line 30, after “has” insert

“or may reasonably be expected to have”.

Amendment 207, in clause 36, page 35, line 12, at end insert—

“(3A) An offence under section 993 of the Companies Act 2006 (fraudulent trading).”

Amendment 208, page 35, line 18, after “(3)” insert “, 3(A)”.

Amendment 209, page 35, line 20, after “(3)” insert “, 3(A)”

Amendment 210, page 35, line 23, after “(3)” insert “, 3(A)”

Amendment 201, in clause 66, page 59, line 8, leave out from “Pornographic content” to end of line 10 and insert

“has the same meaning as section [meaning of pornographic content]”.

This amendment defines pornographic content for the purposes of Part 5. It is consequential on NC33.

Amendment 56, page 59, line 8, after “content” insert “, taken as a whole,”

This amendment would require that content is considered as a whole before being defined as pornographic content.

Amendment 33, in clause 68, page 60, line 33, at end insert—

“(2A) A duty to verify that every individual featured in regulated provider pornographic content is an adult before the content is published on the service.

(2B) A duty to verify that every individual featured in regulated provider pornographic content that is already published on the service when this Act is passed is an adult and, where that is not the case, remove such content from the service.

(2C) A duty to verify that each individual appearing in regulated provider pornographic content has given their permission for the content in which they appear to be published or made available by the internet service.

(2D) A duty to remove regulated provider pornographic content featuring an individual if that individual withdraws their consent, at any time, to the pornographic content in which they feature remaining on the service.”

This amendment creates a duty to verify that each individual featured in pornographic content is an adult and has agreed to the content being uploaded before it is published. It would also impose a duty to remove content if the individual withdraws consent at any time.

Amendment 34, page 60, line 37, leave out “subsection (2)” and insert “subsections (2) to (2D)”.

This amendment is consequential on Amendment 33.

Amendment 31, in clause 182, page 147, line 16, leave out from “unless” to end of line 17 and insert—

“(a) a draft of the instrument has been laid before each House of Parliament,

(b) the Secretary of State has made a motion in the House of Commons in relation to the draft instrument, and

(c) the draft instrument has been approved by a resolution of each House of Parliament.”

This amendment would require a draft of a statutory instrument containing regulations under sections 53 or 54 to be debated on the floor of the House of Commons, rather than in a delegated legislation committee (as part of the affirmative procedure).

Amendment 158, in clause 192, page 155, line 26, after “including” insert “but not limited to”.

This amendment clarifies that the list of types of content in clause 192 is not exhaustive.

Dame Diana Johnson

May I welcome the Minister to his place, as I did not get an opportunity to speak on the previous group of amendments?

New clause 7 and amendments 33 and 34 would require online platforms to verify the age and consent of all individuals featured in pornographic videos uploaded to their site, as well as enabling individuals to withdraw their consent to the footage remaining on the website. Why are the amendments necessary? Let me read a quotation from a young woman:

“I sent Pornhub begging emails. I pleaded with them. I wrote, ‘Please, I’m a minor, this was assault, please take it down.’”

She received no reply and the videos remained live. That is from a BBC article entitled “I was raped at 14, and the video ended up on a porn site”.

This was no one-off. Some of the world’s biggest pornography websites allow members of the public to upload videos without verifying that everyone in the film is an adult or that everyone in the film gave their permission for it to be uploaded. As a result, leading pornography websites have been found to be hosting and profiting from filmed footage of rape, sex trafficking, image-based sexual abuse and child sexual abuse.

In 2020, The New York Times documented the presence of child abuse videos on Pornhub, one of the most popular pornography websites in the world, prompting Mastercard, Visa and Discover to block the use of their cards for purchases on the site. The New York Times reporter Nicholas Kristof wrote about Pornhub:

“Its site is infested with rape videos. It monetizes child rapes, revenge pornography, spy cam videos of women showering, racist and misogynist content, and footage of women being asphyxiated in plastic bags.”

Even before that, in 2019, PayPal took the decision to stop processing payments for Pornhub after an investigation by The Sunday Times revealed that the site contained child abuse videos and other illegal content. The newspaper reported:

“Pornhub is awash with secretly filmed ‘creepshots’ of schoolgirls and clips of men performing sex acts in front of teenagers on buses. It has also hosted indecent images of children as young as three.

The website says it bans content showing under-18s and removes it swiftly. But some of the videos identified by this newspaper’s investigation had 350,000 views and had been on the platform for more than three years.”

One of the women who is now being forced to take legal action against Pornhub’s parent company, MindGeek, is Crystal Palace footballer Leigh Nicol. Leigh’s phone was hacked and private content was uploaded to Pornhub without her knowledge. She said in an interview:

“The damage is done for me so this is about the next generation. I feel like prevention is better than someone having to react to this. I cannot change it alone but if I can raise awareness to stop it happening to others then that is what I want to do…The more that you dig into this, the more traumatising it is because there are 14-year-old kids on these websites and they don’t even know about it. The fact that you can publish videos that have neither party’s consent is something that has to be changed by law, for sure.”

Leigh Nicol is spot on.

Unfortunately, when this subject was debated in Committee, the previous Minister, the hon. Member for Croydon South (Chris Philp), argued that the content I have described—including child sexual abuse images and videos—was already illegal, and there was therefore no need for the Government to introduce further measures. However, that misses the point: the Minister was arguing against the very basis of his own Government’s Bill. At the core of the Bill, as I understand it, is a legal duty placed on online platforms to combat and remove content that is already illegal, such as material relating to terrorism. In keeping with that, my amendments would place a legal duty on online platforms hosting pornographic content to combat and remove illegal content through the specific and targeted measure of verifying the age and consent of every individual featured in pornographic content on their sites. The owners and operators of pornography websites are getting very rich from hosting footage of rape, trafficking and child sexual abuse, and they must be held to account under the law and required to take preventive action.

The Organisation for Security and Co-operation in Europe, which leads action to combat human trafficking across 57 member states, recommends that Governments require age and consent verification on pornography websites in order to combat exploitation. The OSCE told me:

“These sites routinely feature sexual violence, exploitation and abuse, and trafficking victims. Repeatedly these sites have chosen profits over reasonable prevention and protection measures. At the most basic level, these sites should be required to ensure that each person depicted is a consenting adult, with robust age verification and the right to withdraw consent at any time. Since self-regulation hasn’t worked, this will only work through strong, state-led regulation”.

Who else supports that? Legislation requiring online platforms to verify the age and consent of all individuals featured in pornographic content on their sites is backed by leading anti-sexual exploitation organisations including CEASE—the Centre to End All Sexual Exploitation—UK Feminista and the Traffickinghub movement, which has driven the global campaign to expose the abuses committed by, in particular, Pornhub.

New clause 7 and amendments 33 and 34 are minimum safety measures that would stop the well-documented practice of pornography websites hosting and profiting from videos of rape, trafficking and child sexual abuse. I urge the Government to reconsider their position, and I will seek to test the will of the House on new clause 7 later this evening.