[2nd Allocated Day]
[Relevant documents: Report of the Joint Committee on the Draft Online Safety Bill, Session 2021-22: Draft Online Safety Bill, HC 609, and the Government Response, CP 640; First Report of the Digital, Culture, Media and Sport Committee, Amending the Online Safety Bill, HC 271; Second Report of the Petitions Committee, Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Letter from the Minister for Tech and the Digital Economy to the Chair of the Joint Committee on Human Rights relating to the Online Safety Bill, dated 16 June 2022; Letter from the Chair of the Joint Committee on Human Rights to the Secretary of State for Digital, Culture, Media and Sport relating to the Online Safety Bill, dated 19 May 2022; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Further consideration of Bill, as amended in the Public Bill Committee
Mr Speaker

Before I call the Minister to open the debate, I have something to say about the scope of today’s debate. This is day 2 of debate on consideration of the Bill as amended in the Public Bill Committee. We are debating today only the new clauses, amendments and new schedules listed on the selection paper that I have issued today.

Members may be aware that the Government have tabled a programme motion that would recommit certain clauses and schedules to a Public Bill Committee. There will be an opportunity to debate that motion following proceedings on consideration. The Government have also published a draft list of proposed amendments to the Bill that they intend to bring forward during the recommittal process. These amendments are not in scope for today. There will be an opportunity to debate, at a future Report stage, the recommitted clauses and schedules, as amended on recommittal in the Public Bill Committee.

Most of today’s amendments and new clauses do not relate to the clauses and schedules that are being recommitted. These amendments and new clauses have been highlighted on the selection paper. Today will be the final chance for the Commons to consider them: there will be no opportunity for them to be tabled and considered again at any point during the remaining Commons stages.

New Clause 11

Notices to deal with terrorism content or CSEA content (or both)

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service.

(2) A notice under subsection (1) that relates to a regulated user-to-user service is a notice requiring the provider of the service—

(a) to do any or all of the following—

(i) use accredited technology to identify terrorism content communicated publicly by means of the service and to swiftly take down that content;

(ii) use accredited technology to prevent individuals from encountering terrorism content communicated publicly by means of the service;

(iii) use accredited technology to identify CSEA content, whether communicated publicly or privately by means of the service, and to swiftly take down that content;

(iv) use accredited technology to prevent individuals from encountering CSEA content, whether communicated publicly or privately, by means of the service; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service or part of the service, which—

(i) achieves the purpose mentioned in paragraph (a)(iii) or (iv), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(3) A notice under subsection (1) that relates to a regulated search service is a notice requiring the provider of the service—

(a) to do either or both of the following—

(i) use accredited technology to identify search content of the service that is terrorism content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes terrorism content identified by the technology;

(ii) use accredited technology to identify search content of the service that is CSEA content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes CSEA content identified by the technology; or

(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service which—

(i) achieves the purpose mentioned in paragraph (a)(ii), and

(ii) meets the standards published by the Secretary of State (see section 106(10)).

(4) A notice under subsection (1) that relates to a combined service is a notice requiring the provider of the service—

(a) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service, or to use best endeavours to develop or source technology as described in subsection (2)(b) for use on or in relation to that part of the service;

(b) to do either or both of the things described in subsection (3)(a) in relation to the search engine of the service, or to use best endeavours to develop or source technology as described in subsection (3)(b) for use on or in relation to the search engine of the service;

(c) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service and either or both of the things described in subsection (3)(a) in relation to the search engine of the service; or

(d) to use best endeavours to develop or source—

(i) technology as described in subsection (2)(b) for use on or in relation to the user-to-user part of the service, and

(ii) technology as described in subsection (3)(b) for use on or in relation to the search engine of the service.

(5) For the purposes of subsections (2) and (3), a requirement to use accredited technology may be complied with by the use of the technology alone or by means of the technology together with the use of human moderators.

(6) See—

(a) section (Warning notices), which requires OFCOM to give a warning notice before giving a notice under subsection (1), and

(b) section 105 for provision about matters which OFCOM must consider before giving a notice under subsection (1).

(7) A notice under subsection (1) relating to terrorism content present on a service must identify the content, or parts of the service that include content, that OFCOM consider is communicated publicly on that service (see section 188).

(8) For the meaning of “accredited” technology, see section 106(9) and (10).”—(Julia Lopez.)

This clause replaces existing clause 104. The main changes are: for user-to-user services, a notice may require the use of accredited technology to prevent individuals from encountering terrorism or CSEA content; for user-to-user and search services, a notice may require a provider to use best endeavours to develop or source technology to deal with CSEA content.

Brought up, and read the First time.

15:33
Mr Speaker

With this it will be convenient to discuss the following:

Government new clause 12—Warning notices.

Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.

Government new clause 40—Amendment of Enterprise Act 2002.

Government new clause 42—Former providers of regulated services.

Government new clause 43—Amendments of Part 4B of the Communications Act.

Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.

Government new clause 51—Publication by providers of details of enforcement action.

Government new clause 52—Exemptions from offence under section 152.

Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).

New clause 1—Provisional re-categorisation of a Part 3 service

“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.

(2) If OFCOM—

(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and

(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,

New clause 16—Communication offence for encouraging or assisting self-harm

“(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.””

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

New clause 17—Liability of directors for compliance failure

“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.

(2) If OFCOM considers that the failure results from any—

(a) action,

(b) direction,

(c) neglect, or

(d) with the consent

This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.

New clause 23—Financial support for victims support services

“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.

(2) Those regulations must—

(a) specify criteria setting out which victim support services are eligible for financial support under this provision;

(b) set out a means by which the amount of funding available should be determined;

(c) make provision for the funding to be reviewed and allocated on a three year basis.

(3) Regulations under this section—

(a) shall be made by statutory instrument, and

(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”

New clause 28—Establishment of Advocacy Body

“(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.

(2) A “child user”—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) “enforceable requirements” relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.

(8) The Advocacy Body may undertake research on their own account.

(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.

(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.

(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”

New clause 29—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;

(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;

(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—

(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;

(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;

(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);

(e) to promote better coordination within the media literacy sector.

(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 30—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 31—Research conducted by regulated services

“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.

(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—

(a) a specific piece of research held by the service, or

(b) all research the service holds on a topic specified by OFCOM.”

New clause 34—Factual Accuracy

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—

(a) produced user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

(3) The index under subsection (1) must—

(a) satisfy minimum quality criteria to be set by OFCOM, and

(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”

New clause 35—Duty of balance

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service which selects or prioritises particular—

(a) user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

New clause 36—Identification of information incidents by OFCOM

“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.

(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—

(a) identifying, and assessing the severity of, actual or potential information incidents; and

(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).

(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—

(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and

(b) publish such recommendations or other information that OFCOM considers appropriate.

(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.

(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—

(a) the matters it will take into account in determining whether an information incident has arisen;

(b) the matters it will take into account in determining the severity of an incident; and

(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.

(6) For the purposes of this section—

“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;

“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”

This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.

New clause 37—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—

(i) indicate the nature of content on a service (for example, show where it is an advertisement);

(ii) indicate the reliability and accuracy of the content; and

(iii) facilitate control over what content is received;

(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.

(4) OFCOM must prepare guidance about—

(a) the matters referred to in subsection (3) as it considers appropriate; and

(b) minimum standards that media literacy initiatives must meet.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 45—Sharing etc intimate photographs or film without consent

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—

(a) the photograph or film only shows activity that would be ordinarily seen on a public street, except for a photograph or film of breastfeeding;

(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(d) the photograph or film has been previously shared with consent in public;

(e) A reasonably believed that the photograph or film had been previously shared with consent in public;

(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;

(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.

(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.

(5) It is a defence for a person charged with an offence under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;

(c) reasonably believed that the sharing was necessary for the administration of justice;

(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and

(e) reasonably believed that the sharing was in the public interest.

(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(7) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(8) “Photograph” includes the negative as well as the positive version.

(9) “Film” means a moving image.

(10) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”

This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.

New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 48—Threatening to share etc intimate photographs or film

“(1) A person (A) commits an offence if—

(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and

(i) A intends B to fear that the threat will be carried out; or A is reckless as to whether B will fear that the threat will be carried out.

(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.

(3) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(5) References to sharing, or threatening to share, such a photograph or film with another person include—

(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;

(b) showing, or threatening to show, it to another person;

(c) placing, or threatening to place, it for another person to find; or

(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.

(6) “Photograph” includes the negative as well as the positive version.

(7) “Film” means a moving image.

(8) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(10) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images

“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.

(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, video testifying etc.) which is already prescribed by law.

New clause 50—Anonymity for victims of offences involving the sharing of intimate images

“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.

(2) In subsection 1 after paragraph (db) insert—

(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

Similar to NC49, this new clause allows victims of intimate image abuse the same availability for anonymity as other sexual offences to protect their identities and give them the confidence to testify against their abuser without fear of repercussions.

New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements

“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.

(2) The report must be laid before Parliament within six months of the passing of this Act.”

New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration

‘(1) A person (A) commits an offence if—

(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—

(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or

(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and

(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if—

(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;

(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of sub-section (3) or section 25A of the Immigration Act 1971;

(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.

(4) It is a defence for a person charged under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime and

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.

(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”

This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.

Government amendments 234 and 102 to 117.

Amendment 195, in clause 104, page 87, line 10, leave out subsection 1 and insert—

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—

(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user to user service or a regulated search service to the provider of the service;

(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”

Amendment 152, page 87, line 18, leave out ‘whether’.

This amendment is consequential on Amendment 153.

Amendment 153, page 87, line 19, leave out ‘or privately’.

This amendment removes the ability to monitor encrypted communications.

Government amendment 118.

Amendment 204, in clause 105, page 89, line 17, at end insert—

“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”

This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.

Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.

Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).

Government amendment 175.

Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).

This amendment removes the bar of conditionality that must be met for super complaints that relate to a single regulated service.

Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.

Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—

“(a) The Secretary of State, and

“(b) such other persons as OFCOM considers appropriate.”

This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.

Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert

“90 day maximum time limits in relation to the determination and notification to the complainant of—”.

This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.

Amendment 26, in clause 146, page 123, line 33, leave out

“give OFCOM a direction requiring”

and insert “may make representations to”.

Amendment 27, page 123, line 36, leave out subsection (2) and insert—

“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”

Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert

“established under this section is to consist of the following members—”.

Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert

“established under this section must”.

Amendment 30, page 124, line 4, leave out subsection (5).

Amendment 32, page 124, line 4, leave out clause 148.

Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.

Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—

“(a) B has not consented for A to send or give the photograph or film to B, and”.

Government amendments 249 to 252, 228, 229 and 235 to 237.

Government new schedule 2—Amendments of Part 4B of the Communications Act.

Government new schedule 3—Video-sharing platform services: transitional provision etc.

Government amendment 238.

Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.

This amendment would give the power to make regulations under Schedule 11 to OFCOM.

Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.

Amendment 1, page 198, line 9, at end insert—

“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”

Amendment 159, page 198, line 9, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.

Amendment 9, page 198, line 28, leave out “and” and insert “or”.

Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.

Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.

Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.

Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.

Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.

Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).

This amendment is consequential on Amendment 35.

Government amendments 230, 253 to 261 and 233.

Paul Scully

I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.

I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.

The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.

Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.

Priti Patel (Witham) (Con)

I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?

Paul Scully

It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.

With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.

New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.

Rehman Chishti (Gillingham and Rainham) (Con)

Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.

Paul Scully

This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.

Charlotte Nichols (Warrington North) (Lab)

Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?

Paul Scully

Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.

Priti Patel

This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services. In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.

New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.

The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.

Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.

Kit Malthouse Portrait Kit Malthouse (North West Hampshire) (Con)
- Hansard - - - Excerpts

Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.

Kit Malthouse Portrait Kit Malthouse
- Hansard - - - Excerpts

I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.

15:44
David Davis Portrait Mr David Davis (Haltemprice and Howden) (Con)
- Hansard - - - Excerpts

My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

Luke Evans Portrait Dr Luke Evans (Bosworth) (Con)
- Hansard - - - Excerpts

To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.

The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.

Sajid Javid Portrait Sajid Javid (Bromsgrove) (Con)
- Hansard - - - Excerpts

My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.

Jim Shannon Portrait Jim Shannon (Strangford) (DUP)
- Hansard - - - Excerpts

In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.

Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.

The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.

The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.

New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.

Suzanne Webb Portrait Suzanne Webb (Stourbridge) (Con)
- Hansard - - - Excerpts

The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.

We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.

Luke Evans Portrait Dr Luke Evans
- Hansard - - - Excerpts

It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.

The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.

Margaret Hodge Portrait Dame Margaret Hodge (Barking) (Lab)
- Hansard - - - Excerpts

The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Thank you, Mr Speaker; I will try to keep my remarks very much in scope.

The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.

Jim Shannon Portrait Jim Shannon
- Hansard - - - Excerpts

This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.

I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—that will come later, Mr Speaker, and I will not tread too much into that, as it includes the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.

The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make sure that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access to those below their minimum age, and enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.

16:00
Mike Amesbury Portrait Mike Amesbury (Weaver Vale) (Lab)
- Hansard - - - Excerpts

On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

As I said, the social media platforms will have to put in place robust age assurance and age verification for material in an accredited form that is acceptable to Ofcom, which will look at that.

Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner as statutory consultees for the code of practice and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies will have to take proactive measures to tackle such content.

Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.

Ruth Edwards Portrait Ruth Edwards (Rushcliffe) (Con)
- Hansard - - - Excerpts

I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.

The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.

We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.

Vicky Ford Portrait Vicky Ford (Chelmsford) (Con)
- Hansard - - - Excerpts

On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. Absolutely that area will be tackled, especially for children, but it is really important—as we will see from further changes in the Bill—that, with the removal of the legal but harmful protections, there are other protections for adults.

Sajid Javid Portrait Sajid Javid
- Hansard - - - Excerpts

I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.

Caroline Dinenage Portrait Dame Caroline Dinenage (Gosport) (Con)
- Hansard - - - Excerpts

Will my hon. Friend give way?

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

Will my hon. Friend give way?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will give way first to one of my predecessors.

Caroline Dinenage Portrait Dame Caroline Dinenage
- Hansard - - - Excerpts

I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

Will the Minister give way?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will give way a final time before I finish.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voice is heard in the implementation of the Bill. We are also bringing in coercive control as one of the areas. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit, a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive abuse is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will give way and then finish up.

Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will make a bit of progress, because I am testing Mr Speaker’s patience.

We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that communication is within the course of a licensed activity. A separate group of technical amendments ensures that the definition of sending false and threatening communications will capture all circumstances—that is far wider than we have at the moment.

We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.

Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.

I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I will give way a final time.

Debbie Abrahams Portrait Debbie Abrahams
- Hansard - - - Excerpts

I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.

Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

I call the shadow Minister.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.

We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.

16:14
We must acknowledge that the situation has been made even harder by the huge changes that we have seen in the Government since the Bill was first introduced. Since its First Reading, it has been the responsibility of three different Ministers and two Secretaries of State. Remarkably, it has seen three Prime Ministers in post, too. We can all agree that legislation that will effectively keep people safe online urgently needs to be on the statute book: that is why Labour has worked hard and will continue to work hard to get the Bill over the line, despite the best efforts of this Government to kick the can down the road.
The Government have made a genuine mess of this important legislation. Before us today are a huge number of new amendments tabled by the Government to their own Bill. We now know that the Government also plan to recommit parts of their own Bill—to send them back into Committee, where the Minister will attempt to make significant changes that are likely to damage even further the Bill’s ability to properly capture online harm.
We need to be moving forwards, not backwards. With that in mind, I am keen to speak to a number of very important new clauses this afternoon. I will first address new clause 17, which was tabled by my right hon. Friend the Member for Barking (Dame Margaret Hodge), who has been an incredibly passionate and vocal champion for internet regulation for many years.
As colleagues will be aware, the new clause will fix the frustrating gaps in Ofcom’s enforcement powers. As the Bill stands, it gives Ofcom the power to fine big tech companies only 10% of their turnover for compliance failures. It does not take a genius to recognise that that can be a drop in the ocean for some of the global multimillionaires and billionaires whose companies are often at the centre of the debate around online harm. That is why the new clause, which will mean individual directors, managers or other officers finally being held responsible for their compliance failures, is so important. When it comes to responsibilities over online safety, it is clear that the Bill needs to go further if the bosses in Silicon Valley are truly to sit up, take notice and make positive and meaningful changes.
Jeremy Wright Portrait Sir Jeremy Wright (Kenilworth and Southam) (Con)
- Hansard - - - Excerpts

I am afraid I cannot agree with the hon. Lady that the fines would be a drop in the ocean. These are very substantial amounts of money. In relation to individual director liability, I completely understand where the right hon. Member for Barking (Dame Margaret Hodge) is coming from, and I support a great deal of what she says. However, there are difficulties with the amendment. Does the hon. Member for Pontypridd (Alex Davies-Jones) accept that it would be very odd to end up in a position in which the only individual director liability attached to information offences, meaning that, as long as an individual director was completely honest with Ofcom about their wrongdoing, they would attract no individual liability?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It may be a drop in the ocean to the likes of Elon Musk or Mark Zuckerberg—these multibillionaires who are taking over social media and using it as their personal plaything. They are not going to listen to fines; the only way they are going to listen, sit up and take notice is if criminal liability puts their neck on the line and makes them answer for some of the huge failures of which they are aware.

The right hon. and learned Member mentions that he shares the sentiment of the amendment but feels it could be wrong. We have an opportunity here to put things right and put responsibility where it belongs: with the tech companies, the platforms and the managers responsible. In a similar way to what happens in the financial sector or in health and safety regulation, it is vital that people be held responsible for issues on their platforms. We feel that criminal liability will make that happen.

David Davis Portrait Mr David Davis
- Hansard - - - Excerpts

May I intervene on a point of fact? The hon. Lady says that fines are a drop in the ocean. The turnover of Google is $69 billion; 10% of that is just shy of $7 billion. That is not a drop in the ocean, even to Elon Musk.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

We are looking at putting people on the line. It needs to be something that people actually care about. Money does not matter to these people, as we have seen with the likes of Google, Elon Musk and Mark Zuckerberg; what matters to them is actually being held to account. Money may matter to Government Members, but it will be criminal liability that causes people to sit up, listen and take responsibility.

While I am not generally in the habit of predicting the Minister’s response or indeed his motives—although my job would be a hell of a lot easier if I did—I am confident that he will try to peddle the line that it was the Government who introduced director liability for compliance failures in an earlier draft of the Bill. Let me be crystal clear in making this point, because it is important. The Bill, in its current form, makes individuals at the top of companies personally liable only when a platform fails to supply information to Ofcom, which misses the point entirely. Directors must be held personally liable when safety duties are breached. That really is quite simple, and I am confident that it would be effective in tackling harm online much more widely.

We also support new clause 28, which seeks to establish an advocacy body to represent the interests of children online. It is intended to deal with a glaring omission from the Bill, which means that children who experience online sexual abuse will receive fewer statutory user advocacy protections than users of a post office or even passengers on a bus. The Minister must know that that is wrong and, given his Government’s so-called commitment to protecting children, I hope he will carefully consider a new clause which is supported by Members on both sides of the House as well as the brilliant National Society for the Prevention of Cruelty to Children. In rejecting new clause 28, the Government would be denying vulnerable children a strong, authoritative voice to represent them directly, so I am keen to hear the Minister’s justification for doing so, if that is indeed his plan.

Members will have noted the bundle of amendments tabled by my hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) relating to Labour’s concerns about the unnecessary powers to overrule Ofcom that the Bill, as currently drafted, gives the Secretary of State of the day. During Committee evidence sessions, we heard from Will Perrin of the Carnegie UK Trust, who, as Members will know, is an incredibly knowledgeable voice when it comes to internet regulation. He expressed concern about the fact that, in comparison with other regulatory frameworks such as those in place for advertising, the Bill

“goes a little too far in introducing a range of powers for the Secretary of State to interfere with Ofcom’s day-to-day doing of its business.”––[Official Report, Online Safety Public Bill Committee, 26 May 2022; c. 117.]

Labour shares that concern. Ofcom must be truly independent if it is to be an effective regulator. Surely we have to trust it to undertake logical processes, rooted in evidence, to arrive at decisions once this regime is finally up and running. It is therefore hard to understand how the Government can justify direct interference, and I hope that the Minister will seriously consider amendments 23 to 30, 32, and 35 to 41.

Before I address Labour’s main concerns about the Government’s proposed changes to the Bill, I want to record our support for new clauses 29 and 30, which seek to bring media literacy duties back into the scope of the Bill. As we all know, media literacy is the first line of defence when it comes to protecting ourselves against false information online. Prevention is always better than cure. Whether it is a question of viral conspiracy theories or Russian disinformation, Labour fears that the Government’s approach to internet regulation will create a two-tier internet, leaving some more vulnerable than others.

However, I am sorry to say that the gaps in this Bill do not stop there. I was pleased to see that my hon. Friend the Member for Rotherham (Sarah Champion) had tabled new clause 54, which asks the Government to formally consider the impact that the use of virtual private networks will have on Ofcom’s ability to enforce its powers. This touches on the issue of future-proofing, which Labour has raised repeatedly in debates on the Bill. As we have heard from a number of Members, the tech industry is evolving rapidly, with concepts such as the metaverse changing the way in which we will all interact with the internet in the future. When the Bill was first introduced, TikTok was not even a platform. I hope the Minister can reassure us that the Bill will be flexible enough to deal with those challenges head-on; after all, we have waited far too long.

That brings me to what Labour considers to be an incredible overturn by the Government relating to amendment 239, which seeks to remove the new offence of harmful communications from the Bill entirely. As Members will know, the communications offence was designed by the Law Commission with the intention of introducing a criminal threshold for the most dangerous online harms. Indeed, in Committee it was welcome to hear the then Minister—the present Minister for Crime, Policing and Fire, the right hon. Member for Croydon South (Chris Philp)—being so positive about the Government’s consultation with the commission. In relation to clause 151, which concerns the communications offences, he even said:

“The Law Commission is the expert in this kind of thing…and it is right that, by and large, we follow its expert advice in framing these offences, unless there is a very good reason not to. That is what we have done—we have followed the Law Commission’s advice, as we would be expected to do.” ––[Official Report, Online Safety Public Bill Committee, 21 June 2022; c. 558.]

Less than six months down the line, we are seeing yet another U-turn from this Government, who are doing precisely the opposite of what was promised.

Removing these communications offences from the Bill will have real-life consequences. It will mean that harmful online trends such as hoax bomb threats, abusive social media pile-ons and fake news such as encouraging people to drink bleach to cure covid will be allowed to spread online without any consequence.

Christian Wakeford Portrait Christian Wakeford (Bury South) (Lab)
- Hansard - - - Excerpts

No Jewish person should have to log on and see Hitler worship, but what we have seen in recent weeks from Kanye West has been nothing short of disgusting, from him saying “I love Hitler” to inciting online pile-ons against Jewish people, and this is magnified by the sheer number of his followers, with Jews actually being attacked on the streets in the US. Does my hon. Friend agree that the Government’s decision to drop the “legal but harmful” measures from the Bill will allow this deeply offensive and troubling behaviour to continue?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank my hon. Friend for that important and powerful intervention. Let us be clear: everything that Kanye West said online is completely abhorrent and has no place in our society. It is not for any of us to glorify Hitler and his comments or praise him for the work he did; that is absolutely abhorrent and it should never be online. Sadly, however, that is exactly the type of legal but harmful content that will now be allowed to proliferate online because of the Government’s swathes of changes to the Bill, meaning that that would be allowed to be seen by everybody. Kanye West has 30 million followers online. His followers will be able to look at, share, research and glorify that content without any consequence to that content being freely available online.

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Further to that point, it is not just that some of the content will be deeply offensive to the Jewish community; it could also harm wider society. Some further examples of postings that would be considered legal but harmful are likening vaccination efforts to Nazi death camps and alleging that NHS nurses should stand trial for genocide. Does my hon. Friend not agree that the changes the Government are now proposing will lead to enormous and very damaging impacts right through society?

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.

We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

I call the Chair of the Select Committee.

Julian Knight Portrait Julian Knight (Solihull) (Con)
- View Speech - Hansard - - - Excerpts

I welcome the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), to his place. To say that he has been given a hospital pass in terms of this legislation is a slight understatement. It is very difficult to understand, and the ability he has shown at the Dispatch Box in grasping many of the major issues is to his credit. He really is a safe pair of hands and I thank him for that.

Looking at the list of amendments, I think it is a bit of a hotchpotch, yet we are going to deal only with certain amendments today and others are not in scope. That shows exactly where we are with this legislation. We have been in this stasis now for five years. I remember that we were dealing with the issue when I joined the Digital, Culture, Media and Sport Committee, and it is almost three years since the general election when we said we would bring forward this world-leading legislation. We have to admit that is a failure of the political class in all respects, but we have to understand the problem and the realities facing my hon. Friend, other Ministers and the people from different Departments involved in drafting this legislation.

We are dealing with companies that are more powerful than the oil barons and railway barons of the 19th century. These companies are more important than many states. The total value of Alphabet, for instance, is more than the total GDP of the Netherlands, and that is probably a low estimate of Alphabet’s global reach and power. These companies are, in many respects, almost new nation states in their power and reach, and they have been brought about by individuals having an idea in their garage. They still have that culture of having power without the consequences that flow from it.

16:30
These companies have created wonderful things that enhance our lives in many respects through better communication and increased human knowledge, which we can barely begin to imagine, but they have done it with a skater boy approach—the idea that they are beyond the law. They had that enshrined in law in the United States, where they have effectively become nothing more than a megaphone or a noticeboard, and they have always relied on that. They are based or domiciled, in the main, in the United States, which is where they draw their legal power. They will always be in that position of power.
We talk about 10% fines and even business interruption to ensure these companies have skin in the game, but we have to realise these businesses are so gigantic and of such importance that they could simply ignore what we do in this place. Will we really block a major social media platform? The only time something like that has been done was when a major social media platform blocked a country, if I remember rightly. We have to understand where we are coming from in that respect.
This loose cannon, Elon Musk, is an enormously wealthy man, and he is quite strange, isn’t he? He is intrinsically imbued with the power of Silicon Valley and those new techno-masters of the universe. We are dealing with those realities, and this Bill is very imperfect.
David Davis Portrait Mr David Davis
- Hansard - - - Excerpts

My hon. Friend is giving a fascinating disquisition on this industry, but is not the implication that, in effect, these companies are modern buccaneer states and we need to do much more to legislate? I am normally a deregulator, but we need more than one Bill to do what we seek to do today.

Julian Knight Portrait Julian Knight
- Hansard - - - Excerpts

My right hon. Friend is correct. We spoke privately before this debate, and he said this is almost five Bills in one. There will be a patchwork of legislation, and there is a time limit. This is a carry-over Bill, and we have to get it on the statute book.

This Bill is not perfect by any stretch of the imagination, and I take the Opposition’s genuine concerns about legal but harmful material. The shadow Minister mentioned the tragic case of Molly Russell. I heard her father being interviewed on the “Today” programme, and he spoke about how at least three quarters of the content he had seen that had prompted that young person to take her life had been legal but harmful. We have to stand up, think and try our best to ensure there is a safer space for young people. This Bill does part of that work, but only part. The work will be done in the execution of the Bill, through the wording on age verification and age assurance.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Given the complexities of the Bill, and given the Digital, Culture, Media and Sport Committee’s other responsibilities, will my hon. Friend join me in saying there should be a special Committee, potentially of both Houses, to keep this area under constant review? That review, as he says, is so badly needed.

Julian Knight Portrait Julian Knight
- Hansard - - - Excerpts

I thank my right hon. Friend for her question, which I have previously addressed. The problem is the precedent it would set. Any special Committee set up by a Bill would be appointed by the Whips, so we might as well forget about the Select Committee system. This is not a huge concern for the Digital, Culture, Media and Sport Committee, because the advent of any such special Committee would probably be beyond the next general election, and I am not thinking to that timeframe. I am concerned about the integrity of Parliament. The problem is that if we do that in this Bill, the next Government will come along and do it with another Bill and then another Bill. Before we know it, we will have a Select Committee system that is Whips-appointed and narrow in definition, and that cuts across something we all vote for.

There are means by which we can have legislative scrutiny—that is the point I am making in my speech. I would very much welcome a Committee being set up after a year, temporarily, to carry out post-legislative scrutiny. My Committee has a Sub-Committee on disinformation and fake news, which could also look at this Bill going forward. So I do not accept my right hon. Friend’s point, but I appreciate completely the concerns about our needing proper scrutiny in this area. We must also not forget that any changes to Ofcom’s parameters can be put in a statutory instrument, which can be prayed against by the Opposition and thus we would have the scrutiny of the whole House in debate, which is preferable to having a Whips-appointed Committee.

I have gone into quite a bit of my speech there, so I am grateful for that intervention in many respects. I am not going to touch on every aspect of this issue, but I urge right hon. and hon. Members in all parts of the House to think about the fact that although this is far from perfect legislation and it is a shame that we have not found a way to work through the legal but harmful material issue, we have to understand the parameters we are working in, in the real world, with these companies. We need to see that there is a patchwork of legislation, and the biggest way in which we can effectively let the social media companies know they have skin in the game in society—a liberal society that created them—is through competition legislation, across other countries and other jurisdictions. I am talking about our friends in the European Union and in the United States. We are working together closely now to come up with a suite of competition legislation. That is how we will be able to cover off some of this going forward. I will be supporting this Bill tonight and I urge everyone to do so, because, frankly, after five years I have had enough.

John Nicolson Portrait John Nicolson
- View Speech - Hansard - - - Excerpts

I rise to speak to the amendments in my name and those of my right hon. and hon. Friends, which of course I support.

It is welcome to see the Online Safety Bill back in the House. As we have debated this Bill and nursed it, as in my case, through both the Bill Committee and the Joint Committee, we have shone a light into some dark corners and heard some deeply harrowing stories. Who can forget the testimony given to us by Molly Russell’s dad, Ian? As we have heard, in the Public Gallery we have bereaved families who have experienced the most profound losses due to the extreme online harms to which their loved ones have been exposed; representatives of those families are watching the proceedings today. The hon. Member for Pontypridd (Alex Davies-Jones) mentioned that Ian is here, but let me mention the names of the children. Amanda and Stuart Stephens are here, and they are the parents of Olly; Andy and Judy Thomas are here, and they are the parents of Frankie; and Lorin LaFave, the mother of Breck, is here, as is Ruth Moss, the mother of Sophie. All have lost children in connection with online harms, and I extend to each our most sincere condolences, as I am sure does every Member of the House. We have thought of them time and time again during the passage of this legislation; we have thought about their pain. All of us hope that this Bill will make very real changes, and we keep in our hearts the memories of those children and other young people who have suffered.

In our debates and Committee hearings, we have done our best to harry the social media companies and some of their secretive bosses. They have often been hiding away on the west coast of the US, to emerge blinking into the gloomy Committee light when they have to answer some questions about their nefarious activities and their obvious lack of concern for the way in which children and others are impacted.

We have debated issues of concern and sometimes disagreement in a way that shows the occasional benefits of cross-House co-operation. I have been pleased to work with friends and colleagues in other parties at every stage of the Bill, not least on Zach’s law, which we have mentioned. The result is a basis of good, much-needed legislation, and we must now get it on to the statute book.

It is unfortunate that the Bill has been so long delayed, which has caused great stress to some people who have been deeply affected by the issues raised, so that they have sometimes doubted our good faith. These delays are not immaterial. Children and young teenagers have grown older in an online world full of self-harm—soon to be illegal harms, we hope. It is a world full of easy-to-access pornography with no meaningful age verification and algorithms that provide harmful content to vulnerable people.

I have been pleased to note that calls from Members on the SNP Benches and from across the House to ensure that specific protection is granted to women and girls online have been heeded. New communications offences on cyber-flashing and intimate image abuse, and similar offences, are to be incorporated. The requirements for Ofcom to consult with the Victims’ Commissioner and the Domestic Abuse Commissioner are very welcome. Reporting tools should also be more responsive.

New clause 28 is an important new clause that SNP Members have been proud to sponsor. It calls for an advocacy body to represent the interests of children. That is vital, because the online world that children experience is ever evolving. It is not the online world that we in this Chamber tend to experience, nor is it the one experienced by most members of the media covering the debate today. We need, and young people deserve, a dedicated and appropriately funded body to look out for them online—a strong, informed voice able to stand up to the representations of big tech in the name of young people. This will, we hope, ensure that regulators get it right when acting on behalf of children online.

I am aware that there is broad support for such a body, including from those on the Labour Benches. We on the SNP Benches oppose the removal of the aspect of the Bill related to legal but harmful material. I understand the free speech arguments, and I have heard Ministers argue that the Government have proposed alternative approaches, which, they say, will give users control over the content that they see online. But adults are often vulnerable, too. Removing measures from the Bill that can protect adults, especially those in a mental health spiral or with additional learning needs, is a dereliction of our duty. An on/off toggle for harmful content is a poor substitute for what was originally proposed.

The legal but harmful discussion was and is a thorny one. It was important to get the language of the Bill right, so that people could be protected from harm online without impinging on freedom of expression, which we all hold dear. However, by sending aspects of the Bill back to Committee, with the intention of removing the legal but harmful provisions, I fear that the Government are simply running from a difficult debate, or worse, succumbing to those who have never really supported this Bill—some who rather approve of the wild west, free-for-all internet. It is much better to rise to the challenge of resolving the conflicts, such as they are, between free speech and legal but harmful. I accept that the Government’s proposals around greater clarity and enforcement of terms and conditions and of transparency in reporting to Ofcom offer some mitigation, but not, in my view, enough.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Gentleman will remember that, when we served on the Joint Committee that scrutinised the draft Bill, we were concerned that the term “legal but harmful” was problematic and that there was a lack of clarity. We thought it would be better to have more clarity and enforcement based on priority illegal offences and on the terms of service. Does he still believe that, or has he changed his mind?

John Nicolson Portrait John Nicolson
- Hansard - - - Excerpts

It is a finely balanced debate. Like so much in legislation, there is not an absolute right and an absolute wrong. We heard contradictory evidence. It is important to measure the advantages and the disadvantages. I will listen to the rest of the debate very carefully, as I have done throughout.

As a journalist in a previous life, I have long been a proponent of transparency and open democracy—something that occasionally gets me into trouble. We on the SNP Benches have argued from the outset that the powers proposed for the Secretary of State are far too expansive and wide-reaching. That is no disrespect to the Minister or the new Secretary of State, but they will know that there have been quite a few Culture Secretaries in recent years, some more temperate than others.

In wishing to see a diminution of the powers proposed we find ourselves in good company, not least with Ofcom. I note that there have been some positive shifts in the proposals around the powers of the Secretary of State, allowing greater parliamentary oversight. I hope that these indicate a welcome acknowledgement that our arguments have fallen on fertile Government soil—although, of course, it could be that the Conservative Secretary of State realises that she may soon be the shadow Secretary of State and that it will be a Labour Secretary of State exercising the proposed powers. I hope she will forgive me for that moment’s cynicism.

16:45
As we have done throughout the progress of this Bill, the SNP will engage with the Government and our friends and colleagues on other Benches. We have worked hard on this Bill, as have so many other Members. In particular, I pay tribute to my friend the hon. Member for Folkestone and Hythe (Damian Collins), who I see sitting on the Back Benches after an all-too-short ministerial career. It has been a steep learning curve for us all. We have met some wonderful, motivated, passionate people, some with sad stories and some with inspiring stories. Let us do all we can to ensure that we do not let them down.
Priti Patel Portrait Priti Patel (Witham) (Con)
- View Speech - Hansard - - - Excerpts

Before I speak to specific clauses, I pay tribute to all the campaigners, particularly the families who have campaigned so hard to give their loved ones a voice through this Bill and to change our laws. Having had some prior involvement in the early stages of this Bill three years ago as Home Secretary, I also pay tribute to many of the officials and Members of this House on both sides who have worked assiduously on the construction, development and advancement of this Bill. In particular, I pay tribute to my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and the work of the Joint Committee; when I was Home Secretary we had many discussions about this important work. I also thank the Minister for the assiduous way in which he has handled interventions and actually furthered the debate with this Bill. There are many Government Departments with involvement in and engagement with this Bill.

The victims must be at the heart of everything that we do now to provide safeguards and protections. Children and individuals have lost their lives because of the online space. We know there is a great deal of good in the online space, but also a great deal of harm, and that must unite us all in delivering this legislation. We have waited a long time for this Bill, but we must come together, knowing that this is foundational legislation, which will have to be improved and developed alongside the technology, and that there is much more work to do.

I start by focusing on a couple of the new clauses, beginning with Government new clause 11 on end-to-end encryption. The House will not be surprised by my background in dealing with end-to-end encryption, particularly the harmful content, the types of individuals and the perpetrators who hide behind end-to-end encryption. We must acknowledge the individuals who harm children or who peddle terrorist content through end-to-end encryption while recognising that encryption services are important to protect privacy.

There is great justification for encryption—business transactions, working for the Government and all sorts of areas of importance—but we must acknowledge in this House that there is more work to do, because these services are being used by those who would do harm to our country, threaten our national interest or threaten the safety of young people and children in particular. We know for a fact that there are sick-minded individuals who seek to abuse and exploit children and vulnerable adults. The Minister will know that, and I am afraid that many of us do. I speak now as a constituency Member of Parliament, and one of my first surgery cases back in 2010 was the sad and tragic case of a mother who came to see me because her son had accessed all sorts of content. Thanks to the Bill, that content will now be ruled as harmful. There were other services associated with access that the family could not see and could not get access to, and encryption platforms are part of that.

There are shocking figures, and I suspect that many of my colleagues in the House will be aware of them. Almost 100,000 reports relating to online child abuse were received by UK enforcement agencies in 2021 alone. That is shocking. The House will recognise my experience of working with the National Crime Agency, to which we must pay tribute for its work in this space, as we should to law enforcement more widely. Police officers and all sorts of individuals in law enforcement are, day in, day out, investigating these cases and looking at some of the most appalling images and content, all in the name of protecting vulnerable children, and we must pay tribute to them as well.

It is also really shocking that that figure of 100,000 reports in 2021 alone is a 29% increase on the previous year. The amount of disturbing content is going up and up, and we are, I am afraid, looking only at the tip of the iceberg. So, I think it is absolutely right—and I will always urge the Government and whichever Secretary of State, be they in the Home Office, DCMS or the MOJ—to put the right measures and powers in place so that we act to prevent child sexual abuse and exploitation, prevent terrorist content from being shielded behind encryption platforms and, importantly, bring those involved to face justice. End-to-end encryption is one thing, but we need end-to-end justice for victims and the prevention of the most heinous crimes.

This is where we, as a House, must come together. I commend the hon. Member for Rotherham (Sarah Champion) in particular for her work relating to girls, everything to do with the grooming gangs, and the most appalling crimes against individuals, quite frankly. I will always urge colleagues to support the Bill, on which we will need to build going forward.

I think I can speak with experience about the difficulties in drafting legislation—both more broadly and specifically in this area, which is complex and challenging. It is hard to foresee the multiplicity of circumstances. My hon. Friend the Member for Folkestone and Hythe was absolutely right to say in his comments to the SNP spokesman, the hon. Member for Ochil and South Perthshire (John Nicolson), that we have to focus on illegal content. It is difficult to get the balance right between the lawful and harmful. The illegal side is what we must focus on.

I also know that many campaigners and individuals—they are not just campaigners, but families—have given heartbreaking and devastating accounts of their experiences of online harms. As legislators, we owe them this Bill, because although their suffering is not something that we will experience, it must bring about the type of changes that we all want to see for everyone—children, adults and vulnerable individuals.

May I ask the Minister for reassurances on the definition of “best endeavours”? As my right hon. Friend the Member for Basingstoke (Dame Maria Miller) touched on, when it comes to implementation, that will be the area where the rubber hits the road. That is where we will need to know that our collective work will be meaningful and will deliver protections—not just change, but protections. We must be honest about the many serious issues that will arise even after we pass the Bill—be it, God forbid, a major terrorist incident, or cases of child sexual exploitation—and there is a risk that, without clarity in this area, when a serious issue does arise, we may not know whether a provider undertook best endeavours. I think we owe it to everyone to ensure that we run a slide rule over every single granular detail.

Cases and issues relating to best endeavours are debated and discussed extensively in court cases, coroners’ inquests and social services child safeguarding work, for example—all right hon. and hon. Members here will have experience of dealing with social services on behalf of their constituents in child protection cases—or, even worse, in serious case reviews or public inquiries that could come in future. I worry that in any response a provider could say that it did its best and had undertaken its best endeavours, as a defence. That would be unacceptable. That would lead those affected to feel as if they suffered an even greater injustice than the violations that they experienced. It is not clear whether best endeavours will be enough to change the culture, behaviour and attitudes of online platforms.

I raise best endeavours in the context of changing attitudes and cultures because in many institutions, that very issue is under live debate right now. That may be in policing, attitudes around women and girls or how we protect other vulnerable groups, even in other services such as the fire service, which we have heard about recently. It is important that we ask those questions and have the scrutiny. We need to hear more about what constitutes best endeavours. Who will hold the providers to account? Ofcom clearly has a role. I know the Minister will do a very earnest and diligent job to provide answers, but the best endeavours principle goes wider than just the Minister on the Front Bench—it goes across the whole of Government. He knows that we will give him every backing to use his sharp elbows—perhaps I can help with my sharp elbows—to ensure that others are held to account.

It will also be for Ofcom to give further details and guidance. As ever, the guidance will be so important. The guidance has to have teeth and statutory powers. It has to be able to put the mirror up and hold people to account. For example, would Ofcom be able, in its notices to providers, to instruct them to use specific technologies and programmes to tackle and end the exposure to exploitation, in relation to end-to-end encryption services, to protect victims? That is an open question, but one that could be put to Ofcom and could be an implementation test. There is no reason why we should not put a series of questions to Ofcom about how it would implement these provisions in practice.

I would like to ask the Minister why vulnerable adults and victims of domestic abuse and violence against women and girls are not included. We must do everything in this House. This is not about being party political. When it comes to all our work on women and violence against women and girls, there should be no party politics whatsoever. We should ensure that what is right for one group is consistent and that the laws are strengthened. That will require the MOJ, as well as the Home Office, to ensure that the work is joined up in the right kind of way.

It is right that powers are available for dealing with terrorist threats and tackling child sexual abuse thoroughly. There is some good work around terrorist content. There is excellent work in GIFCT, the Global Internet Forum to Counter Terrorism. The technology companies are doing great work. There is international co-operation in this space. The House should take some comfort in the fact that the United Kingdom leads the world in this space. We owe our gratitude to our intelligence and security agencies. I give my thanks to MI5 in particular for its work and to counter-terrorism policing, because they have led the world robustly in this work.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My right hon. Friend makes an important point about this being a cross-Government effort. The Online Safety Bill creates a regulatory framework for the internet, but we need to make sure that we have the right offences in law clearly defined. Then it is easy to read them and cross-reference them with the legislation. If we do not have that, it is a job for the whole of Government.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is not straightforward and easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.

The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on counter-terrorism is so important, because, across the world, we have been consistent with our Five Eyes partners. Again, there are good models out there that can be built upon. We will not fix all this through one Bill—we know that. This Bill is foundational, which is why we must move forward.

On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.

Rachel Maclean Portrait Rachel Maclean (Redditch) (Con)
- Hansard - - - Excerpts

I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in acting on reports and removing web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?

17:00
Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

My hon. Friend is absolutely right. I thank her for not just her intervention but her steadfast work when she was a Home Office Minister with responsibility for safeguarding. I also thank the Internet Watch Foundation; many of the statistics and figures that we have been using about child sexual abuse and exploitation content, and the take-downs, are thanks to its work. There is some important work to do there. The Minister will be familiar with its work—[Interruption.] Exactly that.

We need the expertise of the Internet Watch Foundation, so it is about integrating that skillset. There is a great deal of expertise out there, including at the Internet Watch Foundation, at GIFCT on the CT side and, obviously, in our services and agencies. As my right hon. Friend the Member for Basingstoke said, it is crucial that we pool organisations’ expertise to implement the Bill, as we will not be able to create it all over again overnight in government.

I thank my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) for tabling new clause 16, which would create new offences to address the challenges caused by those who promote, encourage and assist self-harm. That has been the subject of much of the debate already, which is absolutely right when we think about the victims and their families. In particular, I thank the Samaritans and others for their work to highlight this important issue. I do not need to dwell on the Samaritans’ report, because I think all hon. Members have read it.

All hon. Members who spoke in the early stages of the Bill, which I did not because I was in government, highlighted this essential area. It is important to ensure that we do everything we can to address it in the right way. Like all right hon. and hon. Members, I pay tribute to the family of Molly Russell. There are no words for the suffering that they have endured, but their campaign of bravery, courage and fortitude aims to close every loophole to stop other young people being put at risk.

Right hon. and hon. Members meet young people in schools every week, and we are also parents and, in some cases, grandparents. To know that this grey area leaves so many youngsters at risk is devastating, so we have almost a collective corporate duty to stand up and do the right thing. The long and short of it is that we need to be satisfied, when passing the Bill, that we are taking action to protect vulnerable people and youngsters who are susceptible to dangerous communications.

As I have emphasised, we should also seek to punish those who cause and perpetrate this harm and do everything we can to protect those who are vulnerable, those with learning disabilities, those with mental health conditions, and those who are exposed to self-harm content. We need to protect them and we have a duty to do that, so I look forward to the Minister’s reply.

I welcome new clauses 45 to 50, tabled by my right hon. Friend the Member for Basingstoke. I pay tribute to her for her work; she has been a strong campaigner for protecting the privacy of individuals, especially women and children, and for closing loopholes that have enabled people to be humiliated or harmed in the ways she has spoken about so consistently in the House. I am pleased that the Deputy Prime Minister, my right hon. Friend the Member for Esher and Walton (Dominic Raab), announced last month that the Government would table amendments in the other place to criminalise the sharing of intimate images, photographs and videos without consent; that is long overdue. When I was Home Secretary I heard the most appalling cases, with which my right hon. Friend the Member for Basingstoke will be familiar. I have met so many victims and survivors, and we owe it to them to do the right thing.

It would be reassuring to hear not just from the Minister in this debate, but from other Ministers in the Departments involved in the Bill, to ensure they are consistent in giving voice to the issues and in working through their Ministries on the implementation—not just of this Bill, but of the golden thread that runs throughout the legislation. Over the last three years, we have rightly produced a lot of legislation to go after perpetrators, and support women and girls, including the Domestic Abuse Act 2021. We should use those platforms to stand up for the individuals affected by these issues.

I want to highlight the importance of the provisions to protect women and girls, particularly the victims and survivors of domestic abuse and violence. Some abusive partners and ex-partners use intimate images in their possession; as the Minister said, that is coercive control, which means that the victim ends up living their life in fear. That is completely wrong. We have heard and experienced too many harrowing and shocking stories of women who have suffered as a result of the use of such images and videos. It must now be a priority for the criminal justice system, and the online platforms in particular, to remove such content. This is no longer a negotiation. Too many of us—including myself, when I was Home Secretary—have phoned platforms at weekends and insisted that they take down content. Quite frankly, I have then been told, “Twitter doesn’t work on a Saturday, Home Secretary” or “This is going to take time.” That is not acceptable. It is an absolute insult to the victims, and is morally reprehensible and wrong. The platforms must be held to account.

Hon. Members will be well aware of the Home Office’s work on the tackling violence against women and girls strategy. I pay tribute to all colleagues, but particularly my hon. Friend the Member for Redditch (Rachel Maclean), who was the Minister at the time. The strategy came about after much pain, sorrow and loss of life, and it garnered an unprecedented 180,000 responses. The range of concerns raised was predominantly related to the issues we are discussing today. We can no longer stay mute and turn a blind eye. We must ensure that the safety of women in the public space offline—on the streets—and online is respected. We know how women feel about the threats. The strategy highlighted so much; I do not want to go over it again, as it is well documented and I have spoken about it in the House many times.

It remains a cause of concern that the Bill does not include a specific VAWG code of practice. We want and need the Bill. We are not going to fix everything through it, but, having spent valued time with victims and survivors, I genuinely believe that we could move towards a code of practice. Colleagues, this is an area on which we should unite, and we should bring such a provision forward; it is vital.

Let me say a few words in support of new clause 23, which was tabled by my right hon. Friend the Member for Basingstoke. I have always been a vocal and strong supporter of services for victims of crime, and of victims full stop. I think it was 10 years ago that I stood in this House and proposed a victims code of practice—a victims Bill is coming, and we look forward to that as well. This Government have a strong record of putting more resources into support for victims, including the £440 million over three years, but it is imperative that offenders—those responsible for the harm caused to victims—are made to pay, and it is absolutely right that they should pay more in compensation.

Companies profiteering from online platforms where these harms are being perpetrated should be held to account. When companies fail in their duties and have been found wanting, they must make a contribution for the harm caused. There are ways in which we can do that. There has been a debate already, and I heard the hon. Member for Pontypridd (Alex Davies-Jones) speak for the Opposition about one way, but I think we should be much more specific now, particularly in individual cases. I want to see those companies pay the price for their crimes, and I expect the financial penalties issued to reflect the severity of the harm caused—we should support that—and that such money should go to supporting the victims.

I pay tribute to the charities, advocacy groups and other groups that, day in and day out, have supported the victims of crime and of online harms. I have had an insight into that work from my former role in Government, but we should never underestimate how traumatic and harrowing it is. I say that about the support groups, but we have to magnify that multiple times for the victims. This is one area where we must ensure that more is done to provide extra resources for them. I look forward to hearing more from the Minister, but also from Ministers from other Departments in this space.

I will conclude on new clause 28, which has already been raised, on the advocacy body for children. There is a long way to go with this—there really is. Children are harmed in just too many ways, and the harm is unspeakable. We have touched on this in earlier debates and discussions on the Bill, in relation to child users on online platforms, and there will be further harm. I gently urge the Government—if not today or through this Bill, then later—to think about how we can pull together the skills and expertise in organisations outside this House and outside Government that give voice to children who have nowhere else to go.

This is not just about the online space; in the cases in the constituency of the hon. Member for Rotherham (Sarah Champion) and other constituencies, we have seen children being harmed under cover. Statutory services failed them and the state failed them. It was state institutional failure that let children down in the cases in Rotherham and other child grooming cases. We could see that all over again in the online space, and I really urge the Government to make sure that that does not happen—and actually never happens again, because those cases are far too harrowing.

There really is a lot here, and we must come together to ensure that the Bill comes to pass, but there are so many other areas where we can collectively put aside party politics and give voice to those who really need representation.

Margaret Hodge Portrait Dame Margaret Hodge
- View Speech - Hansard - - - Excerpts

I pay tribute to all the relatives and families of the victims of online abuse who have chosen to be with us today. I am sure that, for a lot of you, our debate is very dry and detached, yet we would not be here but for you. Our hearts are with you all.

I welcome the Minister to his new role. I hope that he will guide his Bill with the same spirit set by his predecessors, the right hon. Member for Croydon South (Chris Philp) and the hon. Member for Folkestone and Hythe (Damian Collins), who is present today and has done much work on this issue. Both Ministers listened and accepted ideas suggested by Back Benchers across the House. As a result, we had a better Bill.

17:15
We all understand that this is groundbreaking legislation, and that it therefore presents us with complex challenges as we try to legislate to achieve the best answers to the horrific, fast-changing and ever-growing problems of online abuse. Given that complexity, and given that this is our first attempt at regulating online platforms, the new Minister would do well to build on the legacy of his predecessors and approach the amendments on which there are votes tonight as wholly constructive. The policies we are proposing enjoy genuine cross-party support, and are proposed to help the Minister not to cause him problems.
Let me express particular support for new clauses 45 to 50, in the name of the right hon. Member for Basingstoke (Dame Maria Miller), which tackle the abhorrent misogynistic problem of intimate image abuse, and amendments 1 to 14, in the name of the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), which address the issue of smaller platforms falling into category 2, which is now outside the scope of regulations. We all know that the smallest platforms can present the greatest risk. The killing of 51 people in the mosque attacks in Christchurch, New Zealand, is probably the most egregious example, as the individual concerned used 8chan to plan his attack.
New clause 17, which I have tabled, seeks to place responsibility for complying with the new law unequivocally on the shoulders of individual directors of online platforms. As the Bill stands, criminal liability is enforced only when senior tech executives fail to co-operate with information requests from Ofcom. I agree that is far too limited, as the right hon. and learned Member for Kenilworth and Southam said. The Bill allows executives to choose and name the individual whom Ofcom will hold to account, so that the company itself, not Ofcom, decides who is liable. That is simply not good enough.
Let me explain the thinking behind new clause 17. The purpose of the Bill is to change behaviour. Our experience in many other spheres of life tells us that the most effective way of achieving such change is to make individuals at the top of an organisation personally responsible for the behaviour of that organisation. We need to hold the chairmen and women, directors and senior executives to account by making those individuals personally liable for the practices and actions of their organisation.
Let us look at the construction industry, for example. Years ago, building workers dying on construction sites was an all too regular feature of the construction industry. Only when we reformed health and safety legislation and made the directors of construction companies personally responsible and liable for health and safety standards on their sites did we see an incredible 90% drop in deaths on building sites. Similarly, when we introduced corporate and director liability offences in the Bribery Act 2010, companies stopped trying to bribe their way into contracts.
It is not that we want to lock up directors of construction companies or trading companies, or indeed directors of online platforms; it is that the threat of personal criminal prosecution is the most powerful and effective way of changing behaviour. It is just the sort of deterrent tool that the Bill needs if it is to protect children and adults from online harms. That is especially important in this context, because the business model that underpins the profits that platforms enjoy encourages harmful content. The platforms need to encourage traffic on their sites, because the greater the traffic, the more attractive their sites become to advertisers; and the more advertising revenue they secure, the higher the profits they enjoy.
Harmful content attracts more traffic and so supports the platforms’ business objectives. We know that from studies such as the one by Harvard law professor Jonathan Zittrain, which showed that posts that tiptoe close to violating platforms’ terms and conditions generate far more engagement. We also know that from Mark Zuckerberg’s decisions in the lead-up to and just after the 2020 presidential election, when he personally authorised tweaks to the Facebook algorithm to reduce the spread of election misinformation. However, after the election, despite officials at Facebook asking for the change to stay, he ensured that the previous algorithm was reinstated. An internal Facebook memo revealed that the tweak preventing fake news had led to “a decrease in sessions”, which made the platform less attractive to advertisers and hit its profits. Restoring fake news helped restore those profits.
The incentives in online platforms’ business models promote rather than prevent online harms, and we will not break those incentives by threatening to fine companies. We know from our experience elsewhere that, even at 10% of global revenue, such fines will inevitably be viewed as a cost to business, which will simply be passed on by raising advertising charges. However, we can and will break the incentives in the business model if we make Mark Zuckerberg or Elon Musk personally responsible for breaking the rules. It will not mean that we will lock them up, much as some of us might be tempted to do so. It will, however, provide that most powerful incentive that we have as legislators to change behaviour.
Furthermore, we know that the directors of online platforms personally take decisions in relation to harmful content, so they should be personally held to account. In 2018, Facebook’s algorithm was promoting posts for users in Myanmar that incited violence against protesters. The whistleblower Frances Haugen showed evidence that Facebook was aware that its engagement-based content was fuelling the violence, but it continued to roll it out on its platforms worldwide without checks. Decisions made at the top resulted in direct ethnic violence on the ground. That same year, Zuckerberg gave a host of interviews defending his decision to keep Holocaust denial on his platform, saying he did not believe that posts should be taken down for people getting it wrong. The debate continued for two years until 2020, when, only after months of protest, he finally decided to remove that abhorrent content.
In what world do we live where overpaid executives running around in their jeans and sneakers are allowed to make decisions on the hoof about how their platforms should be regulated without being held to account for their actions?
David Davis Portrait Mr David Davis
- Hansard - - - Excerpts

The right hon. Lady and I have co-operated to deal with international corporate villains, so I am interested in her proposal. However, a great number of these actions are taken by algorithms—I speak as someone who was taken down by a Google algorithm—so what happens then? I see no reason why we should not penalise directors, but how do we establish culpability?

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

That is for an investigation by the appropriate enforcement agency—Ofcom et al.—and if there is evidence that culpability rests with the managing director, the owner or whoever, they should be prosecuted. It is as simple as that. A case would have to be established through evidence, and that should be carried out by the enforcement agency. I do not think that this is any different from any other form of financial or other crime. In fact, it is from my experience in that that I came to this conclusion.

John Penrose Portrait John Penrose (Weston-super-Mare) (Con)
- Hansard - - - Excerpts

The right hon. Lady is making a powerful case, particularly on the effective enforcement of rules to ensure that they bite properly and that people genuinely pay attention to them. She gave the example of a senior executive talking about whether people should be stopped for getting it wrong—I think the case she mentioned was Holocaust denial—by making factually inaccurate statements or allowing factually inaccurate statements to persist on their platform. May I suggest that her measures would be even stronger if she were to support new clause 34, which I have tabled? My new clause would require factual inaccuracy to become wrong, to be prevented and to be pursued by the kinds of regulators she is talking about. It would be a much stronger basis on which her measure could then build.

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Indeed. The way the hon. Gentleman describes his new clause, which I will look at, is absolutely right, but can I just make a more general point because it speaks to the point about legal but harmful? What I really fear with the legal but harmful rule is that we create more and more laws to make content illegal and that, ironically, locks up more and more people, rather than creates structures and systems that will prevent the harm occurring in the first place. So I am not always in favour of new laws simply criminalising individuals. I would love us to have kept to the legal but harmful route.

We can look to Elon Musk’s recent controversial takeover of Twitter. Decisions taken by Twitter’s newest owner—by Elon Musk himself—saw use of the N-word increase by nearly 500% within 12 hours of acquisition. And allowing Donald Trump back on Twitter gives a chilling permission to Trump and others to use the site yet again to incite violence.

The tech giants know that their business models are dangerous. Platforms can train their systems to recognise so-called borderline content and reduce engagement. However, it is for business reasons, and business reasons alone, that they actively choose not to do that. In fact, they do the opposite and promote content known to trigger extreme emotions. These platforms are like a “danger for profit” machine, and the decision to allow that exploitation is coming from the top. Do not take my word for it; just listen to the words of Ian Russell. He has said:

“The only person that I’ve ever come across in this whole world…that thought that content”—

the content that Molly viewed—

“was safe was…Meta.”

There is a huge disconnect between what Silicon Valley executives think is safe and what we expect, both for ourselves and for our children. By introducing liability for directors, the behaviour of these companies might finally change. Experience elsewhere has shown us that that would prove to be the most effective way of keeping online users safe. New clause 17 would hold directors of a regulated service personally liable on the grounds that they have failed, or are failing, to comply with any duties set in relation to their service, for instance a failure that leads to the death of a child. The new clause further states that the decision on who was liable would be made by Ofcom, not the provider, meaning that responsibility could not be shirked.

I say to all Members that if we really want to reduce the amount of harmful abuse online, then making senior directors personally liable is a very good way of achieving it. Some 82% of UK adults agree with us, Labour Front Benchers agree and Back Benchers across the House agree. So I urge the Government to rethink their position on director liability and support new clause 17 as a cross-party amendment. I really think it will make a difference.

Damian Collins Portrait Damian Collins
- View Speech - Hansard - - - Excerpts

As Members know, there is a tradition in the United States that when the President signs a new Bill into law, people gather around him in the Oval Office, and multiple pens are used and presented to people who had a part in that Bill being drafted. If we required the King to do something similar with this Bill and gave a pen to every Minister, every Member who had served on a scrutiny Committee and every hon. Member who introduced an amendment that was accepted, we would need a lot of pens and it would take a long time. In some ways, however, that shows the House at its best; the Bill’s introduction has been a highly collaborative process.

The right hon. Member for Barking (Dame Margaret Hodge) was kind in her words about me and my right hon. Friend the Member for Croydon South (Chris Philp). I know that my successor will continue in the same tradition and, more importantly, that he is supported by a team of officials who have dedicated, in some cases, years of their career to the Bill, who care deeply about it and who want to see it introduced with success. I had better be nice to them because some of them are sitting in the Box.

17:30
It is easy to consider the Bill on Report as it is now, thinking about some areas where Members think it goes too far and other areas where Members think it does not quite go far enough, but let us not lose sight of the fact that we are establishing a world-leading regulatory system. It is not the first in the world, but it goes further than any other system in the world in the scope of offences. Companies will have to show priority activity in identifying and mitigating the harm of unlawful activity. A regulator will be empowered to understand what is going on inside the companies, challenge them on the way that they enforce their codes and hold them to account for that. We currently have the ability to do none of those things. Creating a regulator with that statutory power and the power to fine and demand evidence and information is really important.
The case of Molly Russell has rightly been cited as so important many times in this debate. One of the hardships was not just the tragedy that the family had to endure and the cold, hard, terrible fact—presented by the coroner—that social media platforms had contributed to the death of their daughter, but that it took years for the family and the coroner, going about his lawful duty, to get hold of the information that was required and to bring it to people’s attention. I have had conversations with social media companies about how they combat self-harm and suicide, including with TikTok about what they were doing to combat the “blackout challenge”, which has led to the death of children in this country and around the world. They reassure us that they have systems in place to deal with that and that they are doing all that they can, but we do not know the truth. We do not know what they can see and we have no legal power to readily get our hands on that information and publish it. That will change.
This is a systems Bill—the hon. Member for Pontypridd (Alex Davies-Jones) and I have had that conversation over the Dispatch Boxes—because we are principally regulating the algorithms and artificial intelligence that drive the recommendation tools on platforms. The right hon. Member for Barking spoke about that, as have other Members. When we describe pieces of content, they are exemplars of the problem, but the biggest problem is the systems effect. If people posted individually and organically, and that sat on a Facebook page or a YouTube channel that hardly anyone saw, the amount of harm done would be very small. The fact is, however, that those companies have created systems to promote content to people by data-profiling them to keep them on their site longer and to get them coming back more frequently. That has been done for a business reason—to make money. Most of the platforms are basically advertising platforms making money out of other people’s content.
That point touches on every issue that Members have raised so far today. The Bill squarely makes the companies fully legally liable for their business activity, what they have designed to make money for themselves and the detriment that that can cause other people. That amplification of content, giving people more of what they think they want, is seen as a net positive, and people think that it therefore must always be positive, but it can be extremely damaging and negative.
That is why the new measures that the Government are introducing on combating self-harm and suicide are so important. Like other Members, I think that the proposal from my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) is important, and I hope that the Government’s amendment will address the issue fully. We are talking not just about the existing, very high bar in the law on assisting suicide, which almost means being present and part of the act. The act of consistently, systematically promoting content that exacerbates depression, anxiety and suicidal feelings among anyone, but particularly young people, must be an offence in law and the companies must be held to account for that.
When Ian Russell spoke about his daughter’s experience, I thought it was particularly moving when he said that police officers were not allowed to view the content on their own. They worked in shifts for short periods of time, yet that content was pushed at a vulnerable girl by a social media platform algorithm when she was on her own, probably late at night, with no one else to see it and no one to protect her. That was done in a systematic way, consistently, over a lengthy period of time. People should be held to account for that. It is outrageous—it is disgusting—that that was allowed to happen. Preventing that is one of the changes that the Bill will help us to deliver.
David Davis Portrait Mr David Davis
- Hansard - - - Excerpts

I listened with interest to the comments of the right hon. Member for Barking (Dame Margaret Hodge) about who should be held responsible. I am trying to think through how that would work in practice. Frankly, the adjudication mechanism, under Ofcom or whoever it might be, would probably take a rather different view in the case of a company: bluntly, it would go for “on the balance of probabilities”, whereas with an individual it might go for “beyond reasonable doubt”. I am struggling—really struggling—with the question of which would work best. Does my hon. Friend have a view?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My right hon. Friend raises a very good question. As well as having a named individual with criminal liability for the supplying of information, should there be somebody who is accountable within a company, whether that comes with criminal sanctions or not—somebody whose job it is to know? As all hon. Members know if they have served on the Digital, Culture, Media and Sport Committee, which I chaired, on the Public Accounts Committee or on other Select Committees that have questioned people from the big tech companies, the frustrating thing is that no matter who they put up, it never seems to be the person who actually knows.

There needs to be someone who is legally liable, whether or not they have criminal liability, and is the accountable officer. In the same way as in a financial institution, it is really important to have someone whose job it is to know what is going on and who has certain liabilities. The Bill gives Ofcom the power to seek information and to appoint experts within a company to dig information out and work with the company to get it, but the companies need to feel the same sense of liability that a bank would if its systems had been used to launder money and it had not raised a flag.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I will dare to give way to yet another former Committee Chair—the former chair of the Public Accounts Committee.

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

I draw all hon. Members’ attention to issues relating to Barclays Bank in the wake of the economic crisis. An authority—I think it was the Serious Fraud Office—attempted to hold both the bank and its directors to account, but it failed because there was not a corporate criminal liability clause that worked. It was too difficult. Putting such a provision in the Bill would be a means of holding individual directors as well as companies to account, whatever standard of proof was used.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I thank the right hon. Lady for that information.

Let me move on to the debate about encryption, which my right hon. Friend the Member for Haltemprice and Howden has mentioned. I think it is important that Ofcom and law enforcement agencies be able to access information from companies that could be useful in prosecuting cases related to terrorism and child sexual exploitation. No one is suggesting that encrypted messaging services such as WhatsApp should have their encryption removed, and there is no requirement in the Bill for encryption to end, but we might ask how Meta makes money out of WhatsApp when it appears to be free. One way in which it makes money is by gathering huge amounts of data and information about the people who use it, about the names of WhatsApp groups and about the websites people visit before and after sending messages. It gathers a lot of background metadata about people’s activity around using the app and service.

If someone has visited a website on which severe illegal activity is taking place and has then used a messaging service, and the person to whom they sent the message has done the same, it should be grounds for investigation. It should be easy for law enforcement to get hold of the relevant information without the companies resisting. It should be possible for Ofcom to ask questions about how readily the companies make that information available. That is what the Government seek to do through their amendments on encryption. They are not about creating a back door for encryption, which could create other dangers, and not just on freedom of expression grounds: once a back door to a system is created, even if it is only for the company itself or for law enforcement, other people tend to find their way in.

Ian Paisley Portrait Ian Paisley (North Antrim) (DUP)
- Hansard - - - Excerpts

I thank the hon. Member for jointly sponsoring my private Member’s Bill, the Digital Devices (Access for Next of Kin) Bill. Does he agree that the best way to make progress is to ensure open access for the next of kin to devices that a deceased person leaves behind?

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

The hon. Member makes an important point. Baroness Kidron’s amendment has been referred to; I anticipate that future amendments in the House of Lords will also seek to address the issue, which our Joint Committee looked at carefully in our pre-legislative scrutiny.

It should be much easier than it has been for the Russell family and the coroner to gain access to such important information. However, depending on the nature of the case, there may well be times when it would be wrong for families to have access. I think there has to be an expedited and official process through which the information can be sought, rather than a general provision, because some cases are complicated. There should not be a general right in law, but it needs to be a lot easier than it is. Companies should make the information available much more readily than they have done. The Molly Russell inquest had to be delayed for four months because of the late release of thousands of pages of information from Meta to the coroner. That is clearly not acceptable either.

My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) has tabled an amendment relating to small and risky platforms. The categorisation of platforms on the basis of size was linked to duties under the “legal but harmful” provisions, which we expect now to change. The priority illegal harms apply to platforms of all sizes. Surely when illegal activity is taking place on any platform of any size—I hope that the Minister will clarify this later—Ofcom must have the right to intervene and start asking questions. I think that, in practice, that is how we should expect the system to work.

Like other Members who served on the Joint Committee —I am thinking particularly of my hon. Friends the Members for Watford (Dean Russell) and for Stourbridge (Suzanne Webb), both of whom spoke so passionately about this subject, and the hon. Member for Ochil and South Perthshire (John Nicolson) raised it as well—I was delighted to see that the Government had tabled amendments to cover Zach’s law. The fact that someone can deliberately seek out a person with epilepsy and target that person with flashing images with the intention of causing a seizure is a terrible example of the way in which systems can be abused. It is wrong for the platforms to be neutral and have no obligation to identify and stop that action, but the action is wrong in practice as well, and it demonstrates the need for us to ensure that the law keeps pace with the nature of new offences. I was very proud to meet Zach and his mother in October. I said to them then that their work had changed the law, and I am glad that the Government have tabled those amendments.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

May I pay tribute to my hon. Friend for his chairmanship of the Joint Committee last year? We covered a wide range of challenging ethical, moral and technical decisions, with work across both Houses, and I think that the decisions contained in our report informed many of the Government amendments, but it was my hon. Friend’s chairmanship that helped to guide us through that period.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am grateful to my hon. Friend for what he has said, and for his significant work on the Committee.

There is a great deal that we could say about this Bill, but let me end by touching on an important topic that I think my hon. Friend the Member for Dover (Mrs Elphicke) will speak about later: the way in which social media platforms are used by people-trafficking gangs to recruit those who can help them with bringing people into the country in small boats. It was right that the Government included immigration offences in the list of priority illegal harms in schedule 7. It was also right that, following a recommendation from the Joint Committee, they included fraud and scam ads in the scope of the Bill.

We have already accepted, in principle, that advertising can be within the Bill’s scope in certain circumstances, and that priority illegal harms can be written into the Bill and identified as such. As I understand it, my hon. Friend’s amendment seeks to bring advertising services—not just organic posts on social media platforms—into the Bill’s scope as well. I know that the Government want to consider illegal activity in advertising as part of the online advertising review, but I hope that this could be an expedited process running in parallel with the Bill as it completes its stages. Illegal activity in advertising would not be allowed in the offline world. Newspaper editors are legally liable for what appears in their papers, and broadcasters can lose their licence if they allow illegal content to feature in advertising. We do not yet have the same enforcement mechanism through the advertising industry with the big online platforms, such as Google and Facebook, where the bulk of display advertising now goes. Their advertising market is bigger than the television advertising market. We are seeing serious examples of illegal activity, and it cannot be right that, while such examples cannot be posted on a Facebook page, they can be if money is put behind them and they are run as advertisements.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

My hon. Friend is making a very thoughtful speech. This is an important point, because it relates to criminality fuelled by online activity. We have discussed that before in the context of advertising. Tools already exist throughout Government to pick up such criminality, but we need the Bill to integrate them and drive the right outcomes—to stop this criminality, to secure the necessary prosecutions, and to bring about the deterrent effect that my hon. Friend the Member for Dover (Mrs Elphicke) is pursuing.

Natalie Elphicke Portrait Mrs Natalie Elphicke (Dover) (Con)
- Hansard - - - Excerpts

Will my right hon. Friend give way?

Natalie Elphicke Portrait Mrs Elphicke
- Hansard - - - Excerpts

I am grateful to my right hon. Friend for raising this and for his support in this important area that affects our constituencies so much. I will be speaking later to the details of this, which go beyond the advertising payment to the usage, showing and sharing of such content. As he has mentioned schedule 7, does he agree that there is—as I have set out in my amendment—a strong case for making sure that it covers all those illegal immigration and modern slavery offences, given the incredible harm that is being caused and that we see on a day-to-day basis?

17:45
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I agree with my hon. Friend, which is why I think it is important that immigration offences were included in schedule 7 of the Bill. I think this is something my right hon. Friend the Member for Croydon South felt strongly about, having been Immigration Minister before he was a tech Minister. It is right that this has been included in the scope of the Bill and I hope that when the code of practice is developed around that, the scope of those offences will be made clear.

On whether advertising should be included as well as other postings, it may well be that at this time the Online Safety Bill is not necessarily the vehicle through which that needs to be incorporated. It could be done separately through the review of the online advertising code. Either way, these are loopholes that need to be closed, and the debate around the Online Safety Bill has brought about a recognition of what offences can be brought within the regulatory scope of the Bill and where Ofcom can have a role in enforcing those measures. Indeed, the measures on disinformation in the National Security Bill are a good example of that. In some ways it required the National Security Bill to create the offence, and then the offence could be read across into the Online Safety Bill and Ofcom could play a role in regulating the platforms to ensure that they complied with requests to take down networks of Russian state-backed disinformation. Something similar could work with immigration offences as well, but whether it is done that way or through the online advertising review or through new legislation, this is a loophole that needs to be closed.

Sarah Champion Portrait Sarah Champion (Rotherham) (Lab)
- View Speech - Hansard - - - Excerpts

I am learning so much sitting here. I am going to speak just on child protection, but all of us are vulnerable to online harms, so I am really grateful to hon. Members across the House who are bringing their specialisms to this debate with the sole aim of strengthening this piece of legislation to protect all of us. I really hope the Government listen to what is being said, because there seems to be a huge amount of consensus on this.

The reason I am focusing on child protection is that every police officer in this field that I talk to says that, in almost every case, abusers are now finding children first through online platforms. We cannot keep up with the speed or the scale of this, so I look to this Bill to try to do so much more. My frustration is that when the Bill first started, we were very much seen as a world leader in this field, but now the abuse has become so prolific, other countries have stepped in and we are sadly lagging behind, so I really hope the Minister does everything he can to get this into law as soon as possible.

Although there are aspects of the Bill that go a long way towards tackling child abuse online, it is far from perfect. I want to speak on a number of specific ways in which the Minister can hopefully improve it. The NSPCC has warned that over 100 online grooming and child abuse image crimes are likely to be recorded every day while we wait for this crucial legislation to pass. Of course, that is only the cases that are recorded. The number is going to be far greater than that. There are vital protections in the Bill, but there is a real threat that the use of virtual private networks—VPNs—could undermine the effectiveness of these measures. VPNs allow internet users to hide their private information, such as their location and data. They are commonly used, and often advertised, as a way for people to protect their data or watch online content. For example, on TV services such as Netflix, people might be able to access something only in the US, so they could use a VPN to circumvent that restriction to enable them to watch it in this country.

During the Bill’s evidence sessions, Professor Clare McGlynn said that 75% of children aged 16 and 17 used, or knew how to use, a VPN, which means that they can avoid age verification controls. So if companies use age assurance tools, as listed in the safety duties of this Bill, there is no guarantee that they will provide the protections that are needed. I am also concerned that the use of VPNs could act as a barrier to removing indecent or illegal material from the internet. The Internet Watch Foundation provides a blocking list that internet service providers use to block this content, but users with a VPN usually bypass those protections. It also concerns me that a VPN could be used in court to circumvent this legislation, which is very much based in the UK. Have the Government tested what will happen if someone uses a VPN to give the appearance of being overseas?

My new clause 54 would require the Secretary of State to publish, within six months of the Bill’s passage, a report on the effect of VPN use on Ofcom’s ability to enforce the requirements under clause 112. If VPNs cause significant issues, the Government must identify those issues and find solutions, rather than avoiding difficult problems.

New clause 28 would establish a user advocacy body to represent the interests of children in regulatory decisions. Children are not a homogenous group, and an advocacy body could reflect their diverse opinions and experiences. This new clause is widely supported in the House, as we have heard, and the NSPCC has argued that it would be an important way to counterbalance the attempts of big tech companies to reduce their obligations, which are placing their interests over children’s needs.

I would like to see more third sector organisations consulted on the code of practice. The Internet Watch Foundation, which many Members have discussed, already has the necessary expertise to drastically reduce the amount of child sexual abuse material on the internet. The Government must work with the IWF and build on its knowledge of web page blocking and image hashing.

Girls in particular face increased risk on social media, with the NSPCC reporting that nearly a quarter of girls who have taken a nude photo have had their image sent to someone else online without their permission. New clauses 45 to 50 would provide important protections to women and girls from intimate image abuse, by making the non-consensual sharing of such photos illegal. I am pleased that the Government have announced that they will look into introducing these measures in the other place, but we are yet to see any measures to compare with these new clauses.

In the face of the huge increase in online abuse, victims’ services must have the necessary means to provide specialist support. Refuge’s tech abuse team, for example, is highly effective at improving outcomes for thousands of survivors, but the demand for its services is rapidly increasing. It is only right that new clause 23 is instated so that a good proportion of the revenue made from the Bill’s provisions goes towards funding these vital services.

The landmark report by the independent inquiry into child sexual abuse recently highlighted that, between 2017-18 and 2020-21, there was an approximately 53% rise in recorded grooming offences. With this crime increasingly taking place online, the report emphasised that internet companies will need more moderators to aid technology in identifying this complex type of abuse. I urge the Minister to also require internet companies to provide sufficient and meaningful support to those moderators, who have to view and deal with disturbing images and videos on a daily basis. They, as well as the victims of these horrendous crimes, deserve our support.

I have consistently advocated for increased prevention of abuse, particularly through education in schools, but we must also ensure that adults, particularly parents, are educated about the threats online. Internet Matters found that parents underestimate the extent to which their children are having negative experiences online, and that the majority of parents believe their 14 to 16-year-olds know more about technology than they do.

The example that most sticks in my mind was provided by the then police chief in charge of child protection, who said, “What is happening on a Sunday night is that the family are sitting in the living room, all watching telly together. The teenager is online, and is being abused online.” In his words, “You wouldn’t let a young child go and open the door without knowing who is there, but that is what we do every day by giving them their iPad.”

If parents, guardians, teachers and other professionals are not aware of the risks and safeguards, how are they able to protect children online? I strongly encourage the Government to accept new clauses 29 and 30, which would place an additional duty on Ofcom to promote media literacy. Minister, you have the potential—

Sarah Champion Portrait Sarah Champion
- Hansard - - - Excerpts

Thank you, Madam Deputy Speaker. The Minister has the potential to do so much with this Bill. I urge him to do it, and to do it speedily, because that is what this country really needs.

David Davis Portrait Mr David Davis
- View Speech - Hansard - - - Excerpts

I do not agree with every detail of what the hon. Member for Rotherham (Sarah Champion) said, but I share her aims. She has exactly the right surname for what she does in standing up for children.

To avoid the risk of giving my Whip a seizure, I congratulate the Government and the Minister on all they have done so far, both in delaying the Bill and in modifying their stance.

My hon. Friend the Member for Solihull (Julian Knight), who is no longer in the Chamber, said that this is five Bills in one and should have had massively more time. At the risk of sounding like a very old man, there was a time when this Bill would have had five days on Report. That is what should have happened with such a big Bill.

Opposition Members will not agree, but I am grateful that the Government decided to remove the “legal but harmful” clause. The simple fact is that the hon. Member for Pontypridd (Alex Davies-Jones) and I differ not in our aim—my new clause 16 is specifically designed to protect children—but in the method of achieving it. Once upon a time, there was a tradition that this Chamber would consider a Companies Bill every year, because things change over time. We ought to have a digital Bill every year, specifically to address not “legal but harmful” but, “Is it harmful enough to be made illegal?” Obviously, self-harm material is harmful enough to be made illegal.

The hon. Lady and I have similar aims, but we have different perspectives on how to attack this. My perspective is as someone who has seen many pieces of legislation go badly wrong despite the best of intentions.

The Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), knows he is a favourite of mine. He did a fantastic job in his previous role. I think this Bill is a huge improvement, but he has a lot more to do, as he recognises with the Bill returning to Committee.

One area on which I disagree with many of my hon. and right hon. Friends is the question of encryption. The Bill allows Ofcom to issue notices directing companies to use “accredited technology,” but it might as well say “magic,” because we do not know what is meant by “accredited technology.” Clause 104 will create a pressure to undermine the end-to-end encryption that is not only desirable but crucial to our telecommunications. The clause sounds innocuous and legalistic, especially given that the notices will be issued to remove terrorist or child sexual exploitation content, which we all agree has no place online.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Rather than it being magic, does my right hon. Friend agree that a company could not ignore it if we demystified the process? If we say that there is an existing technology that is available and proven to work, the company would have to explain why it is not using that technology or something better.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I will come back to that in some detail.

The first time I used encryption it was one-time pads and Morse, so it was a long time ago. The last time was much more recent. The issue here is that clause 104 causes pressure by requiring real-time decryption. The only way to do that is by either having it unencrypted on the server, having it weakly encrypted or creating a back door. I am talking not about metadata, which I will come back to in a second, but about content. In that context, if the content needs to be rapidly accessible, it is bound to lead to weakened encryption.

This is perhaps a debate for a specialist forum, but it is very dangerous in a whole series of areas. What do we use encryption for? We use it for banking, for legal and privileged conversations, and for conversations with our constituents and families. I could go on and on about the areas in which encryption matters.

Adam Afriyie Portrait Adam Afriyie (Windsor) (Con)
- Hansard - - - Excerpts

My right hon. Friend will be aware that the measure will encompass every single telephone conversation when it switches to IP. That is data, too.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

That is correct. The companies cannot easily focus the measure on malicious content alone, and that is the problem. In everything we do to enforce the law, we have to balance the extent to which we make the job of the law enforcement agency possible—ideally, easy—against the rights we take away from innocent citizens. That is the key balance. Many bad things happen in households, but we do not require people to live in houses with glass walls. That shows the intrinsic problem we have.

18:09
That imposition on privacy cannot sit comfortably with anybody who takes privacy rights seriously. As an aside, let me say to the House that the last thing we need, given that we want something to happen quickly, or at least effectively and soon, is to find ourselves in a Supreme Court case or a European Court case on privacy imposition. I do not think that is necessary. That is where I think the argument stands. If we end up in a case like that, it will not be about paedophiles or criminals; it will be about the weakening of the encryption of the data of an investigative journalist or a whistleblower. That is where it will come back to haunt us and we have to put that test on it. That is my main opening gambit.
I am conscious that everybody has spoken for quite a long time, so I am trying to make this short. However, the other thing I wish to say is that we have weapons, particularly in terms of metadata. If I recall correctly, Facebook takes down about 300,000 or so sites for paedophile content alone and millions for other reasons; so the use of metadata is very important. Europol carried out a survey of what was useful in terms of the data arising from the internet, social media and the like, and content was put at No. 7, after all sorts of other data. I will not labour the point, but I just worry about this. We need to get it right and so far we have taken more of a blunderbuss approach than a rifle shot. We need to correct that, which is what my two amendments are about.
The other thing I briefly wish to talk about is new clause 16, which a number of people have mentioned in favourable terms. It will make it an offence to encourage or assist another person to self-harm—that includes suicide. I know that the Government have difficulties getting their proposed provisions right in how they interact with other legislation—the suicide legislation and so on. I will be pressing the new clause to a vote. I urge the Government to take this new clause and to amend the Bill again in the Lords if it is not quite perfect. I want to be sure that this provision goes into the legislation. It comes back to the philosophical distinction involving “legal but harmful”, a decision put first in the hands of a Minister and then in the hands of an entirely Whip-chosen statutory instrument Committee, neither of which is a trustworthy vehicle for the protection of free speech. My approach will take it from there and put it in the hands of this Chamber and the other place. Our control, in as much as we control the internet, should be through primary legislation, with maximum scrutiny, exposure and democratic content. If we do it in that way, nobody can argue with us and we will be world leaders, because we are pretty much the only people who can do that.
As I say, we should come back to this area time and time again, because this Bill will not be the last shot at it. People have talked about the “grey area”. How do we assess a grey area? Do I trust Whitehall to do it? No, I do not; good Minister though we have, he will not always be there and another Minister will be in place. We may have the British equivalent of Trump one day, who knows, and we do not want to leave this provision in that context. We want this House, and the public scrutiny that this Chamber gets, to be in control of it.
William Cash Portrait Sir William Cash (Stone) (Con)
- Hansard - - - Excerpts

Many years ago, in the 1970s, I was much involved in the Protection of Children Bill, which was one of the first steps in condemning and making illegal explicit imagery of children and their involvement in the making of such films. We then had the broadcasting Acts and the video Acts, and I was very much involved at that time in saying that we ought to prohibit such things in videos and so on. I got an enormous amount of flak for that. We have now moved right the way forward and it is tremendous to see not only the Government but the Opposition co-operating on this theme. I very much sympathise with not only what my right hon. Friend has just said—I am very inclined to support his new clause for that reason—but with what the right hon. Member for Barking (Dame Margaret Hodge) said. I was deeply impressed by the way in which she presented the argument about the personal liability of directors. We cannot distinguish between a company and the people who run it, and I am interested to hear what the Government have to say in reply to that.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I very much agree with my hon. Friend on that. He and I have been allies in the past—and sometimes opponents—and he has often been far ahead of other people. I am afraid that I do not remember the example from the 1970s, as that was before even my time here, but I remember the intervention he made in the 1990s and the fuss it caused. From that point of view, I absolutely agree with him. My new clause is clearly worded and I hope the House will give it proper consideration. It is important that we put something in the Bill on this issue, even if the Government, quite properly, amend it later.

I wish to raise one last point, which has come up as we have talked through these issues. I refer to the question of individual responsibility. One or two hon. Ladies on the Opposition Benches have cited algorithmic outcomes. As I said to the right hon. Member for Barking, I am worried about how we place the responsibility, and how it would lead the courts to behave, and so on. We will debate that in the next few days and when the Bill comes back again.

There is one other issue that nothing in this Bill covers, and I am not entirely sure why. Much of the behaviour pattern is algorithmic, and it is algorithmic by explicit design. As a number of people have said, it is designed as clickbait; it is designed to bring people back. We may get to a point, particularly if we come back to this year after year, of saying, “There are going to be rules about your algorithms, so you have to write it into the algorithm. You will not use certain sorts of content, pornographic content and so on, as clickbait.” We need to think about that in a sophisticated and subtle way. I am looking at my hon. Friend the Member for Folkestone and Hythe (Damian Collins), the ex-Chairman of the Select Committee, on this issue. If we are going to be the innovators—and we are the digital world innovators—we have to get this right.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My right hon. Friend is right to raise this important point. The big area here is not only clickbait, but AI-generated recommendation tools, such as a news feed on Facebook or “next up” on YouTube. Mitigating the illegal content on the platforms is not just about content moderation and removal; it is about not promoting it in the first place.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

My hon. Friend is exactly right about that. I used the example of clickbait as shorthand. The simple truth is that “AI-generated” is also a misnomer, because these things are not normally AI; they are normally algorithms written specifically to recommend and to maximise returns and revenue. We are not surprised at that. Why should we be? After all, these are commercial companies we are talking about and that is what they are going to do. Every commercial company in the world operates within a regulatory framework that prevents it from making profits out of antisocial behaviour.

Aaron Bell Portrait Aaron Bell (Newcastle-under-Lyme) (Con)
- Hansard - - - Excerpts

On the AI point, let me say that the advances we have seen over the weekend are remarkable. I have just asked OpenAI.com to write a speech in favour of the Bill and it is not bad. That goes to show that the risks to people are not just going to come from algorithms; people are going to be increasingly scammed by AI. We need a Bill that can adapt with the times as we move forward.

David Davis Portrait Mr Davis
- Hansard - - - Excerpts

Perhaps we should run my speech against—[Laughter.] I am teasing. I am coming to the end of my comments, Madam Deputy Speaker. The simple truth is that these mechanisms—call them what you like—are controllable if we put our mind to it. It requires subtlety, testing the thing out in practice and enormous expert input, but we can get this right.

None Portrait Several hon. Members rose—
- Hansard -

Eleanor Laing Portrait Madam Deputy Speaker (Dame Eleanor Laing)
- Hansard - - - Excerpts

It will be obvious to everyone present that a great many Members wish to speak. Although we have a lot of time for this Bill, it is not infinite, and some speeches, so far, have been extremely long. I am trying to manage this without a formal time limit, because the debate flows better without one, but I hope that Members will now limit themselves to around eight minutes. If they do not do so, there will be a formal time limit of less than eight minutes.

John McDonnell Portrait John McDonnell (Hayes and Harlington) (Lab)
- View Speech - Hansard - - - Excerpts

The debate so far has been serious, and it has respected the views that have been expressed not only by Members from across the House, on a whole range of issues, but by the families joining us today who have suffered such a sad loss.

I wish to address one detailed element of the Bill, and I do so in my role as secretary of the National Union of Journalists’ cross-party parliamentary group. It is an issue to which we have returned time and again when we have been debating legislation of this sort. I just want to bring it to the attention of the House; I do not intend to divide the House on this matter. I hope that the Government will take up the issue, and then, perhaps, when it goes to the other place, it will be resolved more effectively than it has been in this place. I am happy to offer the NUJ’s services in seeking to provide a way forward on this matter.

Many investigative journalists base their stories on confidential information, disclosed often by whistleblowers. There has always been an historic commitment—in this House as well—to protect journalists’ right to protect their sources. It has been at the core of the journalists’ code of practice, promoted by the NUJ. As Members know, in some instances, journalists have even gone to prison to protect their sources, because they believe that it is a fundamental principle of journalism, and also a fundamental principle of the role of journalism in protecting our democracy.

The growth in the use of digital technology in journalism has raised real challenges in protecting sources. In the case of traditional material, a journalist has possession of it, whereas with digital technology a journalist does not own or control the data in the same way. Whenever legislation of this nature is discussed, there has been a long-standing, cross-party campaign in the House to seek to protect this code of practice of the NUJ and to provide protection for journalists to protect their sources and their information. It goes back as far as the Police and Criminal Evidence Act 1984. If Members can remember the operation of that Act, they will know that it requires the police or the investigatory bodies to obtain a production order, and requires notice to be given to journalists of any attempt to access information. We then looked at it again in the Investigatory Powers Act 2016. Again, what we secured there were arrangements by which there should be prior approval by a judicial commissioner before investigatory powers can be used to seek communications data likely to compromise a journalist’s sources. There has been a consistent pattern.

To comply with Madam Deputy Speaker’s attempt to constrain the length of our speeches, let me briefly explain to Members what amendment 204 would do. It is a moderate probing amendment, which seeks to ask the Government to look again at this matter. When Ofcom is determining whether to issue a notice to intervene or when it is issuing a notice to a tech platform to monitor user-to-user content, the amendment asks it to consider the level of risk of the specified technology accessing, retaining or disclosing the identity of any confidential journalistic source or confidential journalistic material. The amendment stands in the tradition of the other amendments that have been tabled in this House and that successive Governments have agreed to. It puts the onus on Ofcom to consider how to ensure that technologies can be limited to the purpose that was intended. It should not result in massive data harvesting operations, which were referred to earlier, or become a back-door way for investigating authorities to obtain journalistic data or material without official judicial approval.

18:15
David Davis Portrait Mr Davis
- Hansard - - - Excerpts

I rise in support of the right hon. Gentleman. The production order structure, as it stands, is already being abused: I know of a case in place today. The measure should be stronger and clearer—the Bill contains almost nothing on this—on the protection of journalists, whistleblowers and all those acting for public interest reasons.

John McDonnell Portrait John McDonnell
- Hansard - - - Excerpts

The right hon. Gentleman and I have some form on this matter going back a number of years. The amendment is in the tradition that this House has followed of passing legislation to protect journalists, their sources and their material. I make this offer again to the Minister: the NUJ is happy to meet and discuss how the matter can be resolved effectively through the tabling of an amendment in the other place or discussions around codes of practice. However, I emphasise to the Minister that, as we have found previously, the stronger protection is through a measure in the Bill itself.

Jeremy Wright Portrait Sir Jeremy Wright (Kenilworth and Southam) (Con)
- View Speech - Hansard - - - Excerpts

I rise to speak to amendments 1 to 9 and new clause 1 in my name and the names of other hon. and right hon. Members. They all relate to the process of categorisation of online services, particularly the designation of some user-to-user services as category 1 services. There is some significance in that designation. In the Bill as it stands, perhaps the greatest significance is that only category 1 services have to concern themselves with so-called “legal but harmful” content as far as adults are concerned. I recognise that the Government have advertised their intention to modify the Bill so that users are offered instead mechanisms by which they can insulate themselves from such content, but that requirement, too, would only apply to category 1 services. There are also other obligations to which only category 1 services are subject—to protect content of democratic importance and journalistic content, and extra duties to assess the impact of their policies and safety measures on rights of freedom of expression and privacy.

Category 1 status matters. The Bill requires Ofcom to maintain a register of services that qualify as category 1 based on threshold criteria set out in regulations under schedule 11 of the Bill. As schedule 11 stands, the Secretary of State must make those regulations, specifying threshold conditions, which Ofcom must then apply to designate a service as category 1. That is based only on the number of users of the service and its functionalities, which are defined in clause 189.

Amendments 2 to 8 would replace the word “functionalities” with the word “characteristics”. This term is defined in amendment 1 to include not only functionalities—in other words, what can be done on the platform—but other aspects of the service: its user base, its business model, and its governance and other systems and processes. Incidentally, that definition of the term “characteristics” is already in the Bill in clause 84 dealing with risk profiles, so it is a definition that the Government have used themselves.

Categorisation is about risk, so the amendments ask more of platforms and services where the greatest risk is concentrated; but the greatest risk will not always be concentrated in the functionality of an online service. For example, its user base and business model will also disclose a significant risk in some cases. I suggest that there should be broader criteria available to Ofcom to enable it to categorise. I also argue that the greatest risk is not always concentrated on the platforms with the most users. Amendment 9 would change schedule 11 from its current wording, which requires the meeting of both a scale and a functionality threshold for a service to be designated as category 1, to instead require only one or the other.

Very harmful content being located on smaller platforms is an issue that has been discussed many times in consideration of the Bill. That could arise organically or deliberately, with harmful content migrating to smaller platforms to escape more onerous regulatory requirements. Amendment 9 would resolve that problem by allowing Ofcom to designate a service as category 1 based on its size or on its functionalities—or, better yet, on its broader characteristics.

I do not want to take too many risks, but I think the Government have some sympathy with my position, based on the indicative amendments they have published for the further Committee stage they would like this Bill to have. I appreciate entirely that we are not discussing those amendments today, but I hope, Madam Deputy Speaker, you will permit me to make some brief reference to them, as some of them are on exactly the same territory as my amendments here.

Some of those amendments that the Government have published would add the words “any other characteristics” to schedule 11 provisions on threshold conditions for categorisation, and define them in a very similar way to my amendment 1. The Government may ask whether that will answer my concerns, and the answer is, “Nearly.” I welcome the Government’s adding other characteristics not just to the consideration of threshold criteria but also to the research Ofcom will carry out on how threshold conditions will be set in the first place. However, I am afraid that they do not propose to change schedule 11, paragraph 1(4), which requires regulations made on threshold conditions to include,

“at least one specified condition about number of users and at least one specified condition about functionality.”

That means that to be category 1, a service must still be big.

I ask the Minister to consider again very carefully a way in which we can meet the genuine concern about high harm on small platforms. The amendment that he is likely to bring forward in Committee will not yet do so comprehensively. I also observe in passing that the reference the Government make in those amendments to any other characteristics are those that the Secretary of State considers relevant, not that Ofcom considers relevant—but that is perhaps a conversation for another day.

Secondly, I come on to the process of re-categorisation and new clause 1. It is broadly agreed in this debate that this is a fast-changing landscape; platforms can grow quickly, and the nature and scale of the content on them can change fast as well. If the Government are wedded to categorisation processes with an emphasis on scale, then the capacity to re-categorise a platform that is now category 2B but might become category 1 in the future will be very important.

That process is described in clause 83 of the Bill, but no timeframes or time limits for the re-categorisation process are set out. We can surely anticipate that some category 2B platforms might be reluctant to take on the additional obligations of category 1 status, and may not readily acquiesce in re-categorisation but instead dispute it, including through an appeal to the tribunal provided for in clause 139. That would mean that re-categorisation could take some time after Ofcom has decided to commence it and communicate it to the relevant service. New clause 1 is concerned with what happens in the meantime.

To be clear, I would not expect the powers that new clause 1 would create to be used often, but I can envisage circumstances where they would be beneficial. Let us imagine that the general election is under way—some of us will do that with more pleasure than others. Category 1 services have a particular obligation to protect content of democratic importance, including of course by applying their systems and processes for moderating content even-handedly across all shades of political opinion. There will not be a more important time for that obligation than during an election.

Let us assume also that a service subject to ongoing re-categorisation, because in Ofcom’s opinion it now has considerable reach, is not applying that even-handedness to the moderation of content or even to its removal. Formal re-categorisation and Ofcom powers to enforce a duty to protect democratic content could be months away, but the election will be over in weeks, and any failure to correct disinformation against a particular political viewpoint will be difficult or impossible to fully remedy by retrospective penalties at that point.

New clause 1 would give Ofcom injunction-style powers in such a scenario to act as if the platform is a category 1 service where that is,

“necessary to avoid or mitigate significant harm.”

It is analogous in some ways to the powers that the Government have already given to Ofcom to require a service to address a risk that it should have identified in its risk assessment but did not because that risk assessment was inadequate, and to do so before the revised risk assessment has been done.

Again, the Minister may say that there is an answer to that in a proposed Committee stage amendment to come, but I think the proposal being made is for a list of emerging category 1 services—a watchlist, as it were, of services on the borderline of category 1—and that in itself will not speed up the re-categorisation process. It is the time that that process might take that gives rise to the potential problem that new clause 1 seeks to address.

I hope that my hon. Friend the Minister will consider the amendments in the spirit they are offered. He has probably heard me say before—though perhaps not, because he is new to this, although I do not think anyone else in the room is—that the right way to approach this groundbreaking, complex and difficult Bill is with a degree of humility. That is never an easy sell in this institution, but I none the less think that if we are prepared to approach this with humility, we will all accept, whether Front Bench or Back Bench, Opposition or Government, that we will not necessarily get everything right first time.

Therefore, these Report stages in this Bill of all Bills are particularly important to ensure that where we can offer positive improvements, we do so, and that the Government consider them in that spirit of positive improvement. We owe that to this process, but we also owe it to the families who have been present for part of this debate, who have lost far more than we can possibly imagine. We owe it to them to make sure that where we can make the Bill better, we make it better, but that we do not lose the forward momentum that I hope it will now have.

Neale Hanvey Portrait Neale Hanvey (Kirkcaldy and Cowdenbeath) (Alba)
- View Speech - Hansard - - - Excerpts

I approach my contribution from the perspective of the general principle, the thread that runs through all the amendments on the paper today on safety, freedom of speech, illegal content and so on. That thread is how we deal with the harm landscape and the real-world impact of issues such as cyber-bullying, revenge porn, predatory grooming, self-harm or indeed suicide forums.

There is a serious risk to children and young people, particularly women and girls, on which there has been no debate allowed: the promulgation of gender ideology pushed by Mermaids and other so-called charities, which has created a toxic online environment that silences genuine professional concern, amplifies unquestioned affirmation and brands professional therapeutic concern, such as that of James Esses, a therapist and co-founder of Thoughtful Therapists, as transphobic. That approach, a non-therapeutic and affirmative model, has been promoted and fostered online.

The reality is that adolescent dysphoria is a completely normal thing. It can be a response to disruption from adverse childhood experiences or trauma, it can be a feature of autism or personality disorders or it can be a response to the persistence of misogynistic social attitudes. Dysphoria can present and manifest in many different ways, not just gender. If someone’s gender dysphoria persists even after therapeutic support, I am first in the queue to defend that person and ensure their wishes are respected and protected, but it is an absolute falsity to give young people information that suggests there is a quick-fix solution.

It is not normal to resolve dysphoria with irreversible so-called puberty blockers and cross-sex hormones, or with radical, irreversible, mutilating surgery. Gender ideology is being reinforced everywhere online and, indeed, in our public services and education system, but it is anything but progressive. It attempts to stuff dysphoric or gender non-conforming young people into antiquated, regressive boxes of what a woman is and what a man is, and it takes no account of the fact that it is fine to be a butch or feminine lesbian, a femboy or a boy next door, an old duffer like me, an elite gay sportsman or woman, or anything in between.

18:36
Transitioning will be right for some, but accelerating young people into an affirmative model is absolutely reckless. What do those who perpetuate this myth want to achieve? What is in it for them? Those are fundamental questions that we have to ask. The reality is that the affirmative model is the true conversion therapy—trans-ing away the gay and nullifying same-sex attraction.
I urge all right hon. and hon. Members to watch the four-part documentary “Dysphoric” on YouTube. It is so powerful and shows the growing number of young people who have been transitioned rapidly into those services, and the pain, torment and regret that they have experienced through the irreversible effects of their surgery and treatments. The de-transitioners are bearing the impacts. There is no follow-up to such services, and those people are just left to get on with it. Quite often, their friends in the trans community completely abandon them when they detransition.
I pay particular tribute to Sinead Watson and Ritchie Herron, who are both de-transitioners, for their courage and absolutely incredible resilience in dealing with this issue online and shining a light on this outrage. I also pay tribute to the LGB Alliance, For Women Scotland, and Sex Matters, which have done a huge amount of work to bring this matter to the fore.
Mermaids—the organisation—continues to deny that there is any harm, co-morbidities or serious iatrogenic impacts from hormone treatment or radical surgery. That is a lie; it is not true. Mermaids has promoted the illegal availability of online medicines that do lasting, irreversible damage to young people.
I pay tribute to the Government for the Cass review, which is beginning to shine a light on the matter. I welcome the interim report, but we as legislators must make a connection between what is happening online, how it is policed in society and the message that is given out there. We must link harm to online forums and organisations, as well as to frontline services.
I point out with real regret that I came across a document being distributed through King’s College Hospital NHS Foundation Trust from an organisation called CliniQ, which runs an NHS clinic for the trans community. The document has lots of important safety and health advice, but it normalises self-harm as sexual
“Play that involves blood, cutting and piercing.”
It advises that trans-identifying females can go in
“stealth if it is possible for them”
to private gay clubs, and gives examples of how to obtain sex by deception. It is unacceptable that such information is provided on NHS grounds.
Speaking out about this in Scotland has been a very painful experience for many of us. We have faced doxing, threats, harassment and vilification. In 2019, I raised my concerns about safeguarding with my colleagues in Government. A paper I wrote had this simple message: women are not being listened to in the gender recognition reform debate. I approached the then Cabinet Secretary for Social Security and Older People, Shirley-Anne Somerville, whose brief included equality. She was someone I had known for years and considered a friend; she knew my professional background, my family and, of course, my children. She told me that she shared my concerns—she has children of her own—but she instructed me to be silent. She personally threatened and attempted to bully friends of mine, insisting that they abandon me. I pay great tribute to Danny Stone and the Antisemitism Policy Trust for their support in guiding me through what was an incredibly difficult period of my life. I also pay tribute to the hon. Member for Brigg and Goole (Andrew Percy).
I can see that you are anxious for me to close, Madam Deputy Speaker, so I will—[Interruption.] I will chance my arm a bit further, then.
I am not on my pity pot here; this is not about me. It is happening all over Scotland. Women in work are being forced out of employment. If Governments north and south of the border are to tackle online harms, we must follow through with responsible legislation. Only last week, the First Minister of Scotland, who denied any validity to the concerns I raised in 2019, eventually admitted they were true. But her response must be to halt her premature and misguided legislation, which is without any protection for the trans community, women or girls. We must make the connection from online harms all the way through to meaningful legislation at every stage.
Maria Miller Portrait Dame Maria Miller
- View Speech - Hansard - - - Excerpts

I rise to speak to the seven new clauses in my name and those of right hon. and hon. Members from across the House. The Government have kindly said publicly that they are minded to listen to six of the seven amendments that I have tabled on Report. I hope they will listen to the seventh, too, once they have heard my compelling arguments.

First, I believe it is important that we discuss these amendments, because the Government have not yet tabled their own amendments. It is important that we in this place understand the Government’s true intention on implementing the Law Commission review in full before the Bill completes its consideration.

Secondly, the law simply does not properly recognise as a criminal offence the posting online of intimate images—whether real or fake—without consent. Victims say that having a sexual image of them posted online without their consent is akin to a sexual assault. Indeed, Clare McGlynn went even further by saying that there is a big difference between a physical sexual assault and one committed online: victims are always rediscovering the online images and waiting for them to be redistributed, and cannot see when the abuse will be over. In many ways, it is even more acute.

Just in case anybody in the Chamber is unaware of the scale of the problem after the various contributions that have been made, in the past five years more than 12,000 people reported to the Revenge Porn Helpline almost 200,000 pieces of content that fall into that category. Indeed, since 2014 there have been 28,000 reports to the police of intimate images being distributed without consent.

The final reason why I believe it is important that we discuss the new clauses is that Ofcom will be regulating online platforms based on their adherence to the criminal law, among other things. It is so important that the criminal law actually recognises where criminal harm is done, but at the moment, when it comes to intimate image abuse, it does not. Throughout all the stages of the Bill’s passage, successive Ministers have said very positive things to me about the need to address this issue in the criminal law, but we still have not seen pen being put to paper, so I hope the Minister will forgive me for raising this yet again so that he can respond.

New clauses 45 to 50 simply seek to take the Law Commission’s recommendations on intimate image abuse and put them into law as far as the scope of the Bill will allow. New clause 45 would create a base offence for posting explicit images online without consent. Basing the offence on consent, or the lack of it, makes it comparable with three out of four offences already recognised in the Sexual Offences Act 2003. Subsection (10) of the new clause would make it a criminal offence to distribute fake images, deepfakes or images created using nudification software, which are currently not covered in law at all.

New clauses 46 and 47 recognise cases where there is a higher level of culpability for the perpetrator, where they intend to cause alarm, distress or humiliation. Two in three victims report that they know the perpetrators, as a current or former partner. In evidence to the Public Bill Committee, on which I was very pleased to serve, we heard from the Anjelou Centre and Imkaan that some survivors of this dreadful form of abuse are also at risk of honour-based violence. There are yet more layers of abuse.

New clause 48 would make it a crime to threaten to share an intimate image—this can be just as psychologically destructive as actually sharing it—or to use the image to coerce, control or manipulate the victim. I pay real tribute to the team from the Law Commission, under the leadership of Penney Lewis, who did an amazing job of work over three years on their enquiry to collect this information. In the responses to the enquiry there were four mentions of suicide or contemplated suicide as a result of threats to share these sorts of images online without consent. Around one in seven young women and one in nine young men have experienced a threat to share an intimate or sexual image. One in four calls to the Revenge Porn Helpline relate to threats to share. The list of issues goes on. In 2020 almost 3,000 people, mostly men, received demands for money related to sexual images—“sextortion”, as it is called. This new clause would make it clear that such threats are criminal, the police need to take action and there will be proper protection for victims in law.

New clauses 49 and 50 would go further. The Law Commission is clear that intimate image abuse is a type of sexual offending. Therefore, victims should have the same protection afforded to those of other sexual offences. That is backed up by the legal committee of the Council of His Majesty’s District Judges, which argues that it is appropriate to extend automatic lifetime anonymity protections to victims, just as they would be extended to victims of offences under the Modern Slavery Act 2015. Women’s Aid underlined that point, recognising that black and minoritised women are also at risk of being disowned, ostracised or even killed if they cannot remain anonymous. The special measures in these new clauses provide for victims in the same way as the Domestic Abuse Act 2021.

I hope that my hon. Friend the Minister can confirm that the Government intend to introduce the Law Commission’s full recommendations into the Bill, and that those in scope will be included before the Bill reaches its next stage in the other place. I also hope that he will outline how those measures not in scope of the Bill—specifically on the taking and making of sexual images without consent, which formed part of the Law Commission’s recommendations—will be addressed in legislation swiftly. I will be happy to withdraw my new clauses if those undertakings are made today.

Finally, new clause 23, which also stands in my name, is separate from the Law Commission’s recommendations. It would require a proportion of the fines secured by Ofcom to be used to fund victims’ services. I am sure that the Treasury thinks that it is an innovative way of handling things, although one could argue that it did something similar only a few days ago with regard to the pollution of waterways by water companies. I am sure that the Minister might want to refer to that.

The Bill recognises as crimes many thousands more offences than are currently recognised in law. I hope that the Minister can outline how appropriate measures will be put in place to ensure support for victims, who will now, possibly for the first time, have some provisions in place to assist them. I raised earlier the importance of keeping the Bill and its effectiveness under review. I hope that the House will think about how we do that materially, so we do not end up having another five or 10 years without such a Bill and having to play catch-up in such a complex area.

18:45
Matt Rodda Portrait Matt Rodda (Reading East) (Lab)
- View Speech - Hansard - - - Excerpts

I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.

Matt Rodda Portrait Matt Rodda
- Hansard - - - Excerpts

Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.

Legal but harmful content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had repeatedly shared enormous numbers of pictures and videos of knives over a long period of time—often videos of teenagers playing with knives, waving them or holding them—circulated on 11 different social media platforms. None of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made and bring this area back within the scope of the Bill.

There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.

Adam Afriyie Portrait Adam Afriyie
- View Speech - Hansard - - - Excerpts

I am pleased to follow my fairly close neighbour from Berkshire, the hon. Member for Reading East (Matt Rodda). He raised the issue of legal but harmful content, which I will come to, as I address some of the amendments before us.

I very much welcome the new shape and focus of the Bill. Our primary duty in this place has to be to protect children, above almost all else. The refocusing of the Bill certainly does that, and it is now in a position where hon. Members from all political parties recognise that it is so close to fulfilling its function that we want it to get through this place as quickly as possible with today’s amendments and those that are forthcoming in the Lords and elsewhere in future weeks.

The emerging piece of legislation is better and more streamlined. I will come on to further points about legal but harmful content, but I am pleased to see that concept removed from the Bill for adults, and I will explain why, given the sensitive case that the hon. Member for Reading East mentioned. The information that he talked about being published online should be illegal, so it would be covered by the Bill. Illegal information should not be published and, within the framework of the Bill, would be taken down quickly. We in this place should not shirk our responsibilities; we should make illegal the things that we and our constituents believe to be deeply harmful. If we are not prepared to do that, we cannot wash our hands of the issue and say that some other third party, whether a commercial company or a regulator without those specific powers, has a responsibility to make the rules on our behalf.

I welcome the shape of the Bill, but some great new clauses have been tabled. New clause 16 suggests that we should make it an offence to encourage self-harm, which is fantastic. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) has indicated that he will not press it to a vote, because the Government and all of us acknowledge that that needs to be dealt with at some point, so hopefully an amendment will be forthcoming in the near future.

On new clause 23, it is clear that if a commercial company is perpetrating an illegal act or is causing harm, it should pay for it, and a proportion of that payment must certainly go towards supporting the victims of that crime or breach of the regulations. New clauses 45 to 50 have been articulately discussed by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The technology around revenge pornography and deepfakes is moving forward every day. With some of the fakes online today, it is not possible to tell that they are fakes, even if they are looked at under a microscope. Those areas need to be dealt with, but it is welcome that she will not necessarily press the new clauses to a vote, because those matters must be picked up and defined in primary legislation as criminal acts. There will then be no lack of clarity and we will not need the legal but harmful concept—that will not need to exist. Something will either be illegal, because it is harmful, or not.

The Bill is great because it provides a framework that enables everything else that hon. Members in the House and people across the country may want to be enacted at a future date. It also enables the power to make those judgments to remain with this House—the democratically elected representatives of the people—rather than some grey bureaucratic body or commercial company whose primary interest is rightly to make vast sums of money for its shareholders. It is not for them to decide; it is for us to decide what is legal and what should be allowed to be viewed in public.

On amendment 152, which interacts with new clause 11, I was in the IT industry for about 15 to 20 years before coming to this place, albeit with a previous generation of technology. When it comes to end-to-end encryption, I am reminded of King Canute, who said, “I’m going to pass a law so that the tide doesn’t come in.” Frankly, we cannot pass a law that bans mathematics, which is effectively what we would be trying to do if we tried to ban encryption. The nefarious types or evildoers who want to hide their criminal activity will simply use mathematics to do that, whether on mainstream social media platforms or through more covert routes. We have to be careful about getting rid of all the benefits of secure end-to-end encryption for democracy, safety and protection from domestic abuse—all the good things that we want in society—on the basis of a tiny minority of very bad people who need to be caught. We should not be seeking to ban encryption; we should be seeking to catch those criminals, and there are ways of doing so.

I welcome the Bill; I am pleased with the new approach and I think it can pass through this House swiftly if we stick together and make the amendments that we need. I have had conversations with the Minister about what I am asking for today: I am looking for an assurance that the Government will enable further debate and table the amendments that they have suggested. I also hope that they will be humble, as my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said, and open to some minor adjustments, even to the current thinking, to make the Bill pass smoothly through the Commons and the Lords.

I would like the Government to confirm that it is part of their vision that it will be this place, not a Minister of State, that decides every year—or perhaps every few months, because technology moves quickly—what new offences need to be identified in law. That will mean that Ofcom and the criminal justice system can get on to that quickly to ensure that the online world is a safer place for our children and a more pleasant place for all of us.

Several hon. Members rose—
- Hansard -

Rosie Winterton Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - - - Excerpts

Order. Just a quick reminder: I know it is extremely difficult, and I do not want to interrupt hon. Members when they are making their speeches, but it is important that we try to address the amendments that are before us today. There will be a separate debate on whether to recommit the Bill and on the other ideas, so they can be addressed at that point. As I say, it is important to relate remarks to the amendments that are before us.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- View Speech - Hansard - - - Excerpts

I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.

I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.

On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.

I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms that I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.

Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.

We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.

Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.

19:04
Ofcom recently found that more than a third of children aged eight to 17 said they had seen something “worrying or nasty” online in the past 12 months, but only a third of children knew how to use online reporting or flagging functions. Among adults, a third of internet users were unaware of the potential for inaccurate or biased information online, and just over a third made no appropriate checks before registering their personal details online. Clearly, far more needs to be done to ensure that internet users of all ages are aware of online dangers and of the tools available to keep them safe.
Although programmes such as Google’s “Be Internet Legends” assemblies are a great resource in schools—I was pleased to visit one at Park Road Junior Infant and Nursery School in Batley recently—we cannot rely on platforms to do this themselves. We have had public information campaigns on the importance of wearing seatbelts, and on the dangers of drink-driving and smoking, and the digital world is now one of the largest dangers most people face in their daily lives. The public sector clearly has a role to warn of the dangers and promote healthy digital habits.
Let me give one example from the territory of legal but harmful content, which Members have spoken about as opaque, challenging and thorny. I agree with all those comments, but if platforms have a tool within them that switches off legal but harmful content, it strikes me as incredibly important that users know what that tool does—that is, they know what information they may be subjected to if it is switched on, and they know exactly how to turn it off. Yet I have heard nothing from the Government since their announcement last week that suggests they will be taking steps to ensure that this tool is easily accessible to users of all ages and digital abilities, and that is exactly why there is a need for a proper digital media literacy strategy.
I therefore support new clauses 29 and 30, tabled by my colleagues in the SNP, which would empower Ofcom to publish a strategy at least every three years that sets out the measures it is taking to promote media literacy among the public, including through educational initiatives and by ensuring that platforms take the steps needed to make their users aware of online safety tools.
Finally, I turn to the categorisation of platforms under part 7 of the Bill. I feel extremely strongly about this subject and agree with many comments made by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). The categorisation system listed in the Bill is not fit for purpose. I appreciate that categorisation is largely covered in part 3 and schedule 10, but amendment 159, which we will be discussing in Committee, and new clause 1, which we are discussing today, are important steps towards addressing the Government’s implausible position—that the size of a platform equates to the level of risk. As a number of witnesses stated in Committee, that is simply not the case.
It is completely irresponsible and narrow-minded to believe that there are no blind spots in which small, high-risk platforms can fester. I speak in particular about platforms relating to dangerous, extremist content—be it Islamist, right wing, incel or any other. These platforms, which may fall out of the scope of the Bill, will be allowed to continue to host extremist individuals and organisations, and their deeply dangerous material. I hope the Government will urgently reconsider that approach, as it risks inadvertently pushing people, including young people, towards greater harm online—either for individuals or for society as a whole.
Although I am pleased that the Bill is back before us today, I am disappointed that aspects have been weakened since we last considered it, and urge the Government to consider closely some proposals we will vote on this evening, which would go a considerable way to ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.
John Penrose Portrait John Penrose
- View Speech - Hansard - - - Excerpts

It is a pleasure to follow Zach’s MP, the hon. Member for Batley and Spen (Kim Leadbeater). I particularly want to pick up on her final comments about the difficulties of platforms—not just small platforms, but larger ones—hosting extremist content, be it incels, the alt-right, the radical left or any other kind.

I will speak to my new clauses 34 and 35, which seek to deal with both disinformation and misinformation. They are important amendments, because although the Bill has taken huge steps forward—we are led to believe that it may take a couple more in due course when the revised version comes back if the recommittal is passed—there are still whole categories of harm that it does not yet address. In particular, it focuses, rightly and understandably, on individual harms to children and illegal activities as they relate to adults, but it does not yet deal with anything to do with collective harms to our society and our democracy, which matter too.

We have heard from former journalists in this debate. Journalists know it takes time and money to come up with a properly researched, authoritatively correct, accurate piece of journalism, but it takes a fraction of that time and cost to invent a lie. A lie will get halfway around the world before the truth has got its boots on, as the saying rightly goes. Incidentally, the hon. Member for Rotherham (Sarah Champion) said that it is wonderful that we are all learning so much. I share that sentiment; it is marvellous that we are all comparing and sharing our particular areas of expertise.

One person who seems to have all areas of expertise under his belt is my hon. Friend the Member for Folkestone and Hythe (Damian Collins), who chaired the Joint Committee. He rightly pointed out that this is a systems Bill, and it therefore deals with trying to prevent some things from happening—and yet it is completely silent on misinformation and disinformation, and their effect on us collectively, as a society and a democracy. New clauses 34 and 35 are an attempt to begin to address those collective harms alongside some individual harms we face. One of them deals with a duty of balance; the other deals with factual accuracy.

The duty of balance is an attempt to address the problem as it relates to filter bubbles, because this is a systems Bill and because each of us has a tailored filter bubble, by which each of the major platforms, and some of the minor ones, work out what we are interested in and feed us more of the same. That is fine for people who are interested in fishing tackle; that is super. But if someone is interested in incels and they get fed more and more incel stuff, or they are vaguely left wing and get taken down a rabbit hole into the increasingly radical left—or alternatively alt-right, religious extremism or whatever it may be—pretty soon they get into echo chambers, and from echo chambers they get into radicalisation, and from radicalisation they can pretty soon end up in some very murky, dark and deep waters.

There are existing rules for other old-world broadcasters; the BBC, ITV and all the other existing broadcasters have a duty of balance and undue prominence imposed on them by Ofcom. My argument is that we should consider ways to impose a similar duty of balance on the people who put together the programs that create our own individual filter bubbles, so that when someone is shown an awful lot of stuff about incels, or alt-right or radical left politics, somewhere in that filter bubble they will be sent something saying, “You do know that this is only part of the argument, don’t you? Do you know that there is another side to this? Here’s the alternative; here’s the balancing point.” We are not doing that at the moment, which is one of the reasons we have an increasingly divided societal and political debate, and that our public square as a society is becoming increasingly more fractious—and dangerous, in some cases. New clause 35 would fix that particular problem.

New clause 34 would deal with the other point—the fact that a lie will get halfway around the world before the truth has got its boots on. It tries to deal with factual accuracy. Factual accuracy is not quite the same thing as truth. Truth is an altogether larger and more philosophical concept to get one’s head around. It is how we string together accurate and correct facts to create a narrative or an explanation. Factual accuracy is an essential building block for truth. We must at least try to ensure that we can all see when someone has made something up or invented something, whether it is that bleach is a good way to cure covid or whatever. When somebody makes something up, we need to know and it needs to be clear. In many cases that is clear, but in many cases, if it is a plausible lie, a deepfake or whatever it may be, it is not clear. We need to be able to see that easily, quickly and immediately, and say, “I can discount this, because I know that the person producing it is a serial liar and tells huge great big porkies, and I shouldn’t be trusting what they are sending me, or I can see that the actual item itself is clearly made up.”

The duty of achieving balance already exists in rules and law in other parts of our society and is tried and tested—it has stood us very well and done a good job for us for 40 or 50 years, since TV and radio became ubiquitous—and the same is true, although not for quite such a long time, for factual accuracy. There are increasingly good methods of checking the factual accuracy of individual bits of content, and if necessary, in some cases of doing so in real time, too. For example, Adobe is leading a very large global grouping producing something called the Content Authenticity Initiative, which can tell if something is a deepfake, because it has an audit trail of where the image, the item or whatever it may be came from and how it has been updated, modified or changed during the course of its life.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

On that point, I want to raise the work that my hon. Friend the Member for Bosworth (Dr Evans), who is not in the Chamber at the moment, has done on body image. When images are photoshopped and changed to present an idea of beauty that is very different from what is possible in the real world, that very much bears on the question of truth. What are my hon. Friend’s thoughts on that point?

John Penrose Portrait John Penrose
- Hansard - - - Excerpts

Addressing that is absolutely essential. That goes for any of the deepfake examples we have heard about, including from my right hon. Friend the Member for Basingstoke (Dame Maria Miller). The whole point about a deepfake is that it is hard to tell that it has been changed, but if we can see that something has been changed, we can easily say, “I know that is not right, I know that is not true, I know that is false, and I can steer away from it and treat it accordingly”.

Just to make sure that everybody understands, this is not some piece of new tech magic; it is already established. Adobe, as I have said, is doing it with the Content Authenticity Initiative, which is widely backed by other very serious tech firms. Others in the journalism world are doing the same thing, with the Journalism Trust Initiative. There is NewsGuard, which produces trust ratings; the Trust Project, which produces trust indicators; and we of course have our own press regulators in this country, the Independent Press Standards Organisation and IMPRESS.

I urge the Government and all here present not to be satisfied with where this Bill stands now. We have all heard how it can be improved. We have all heard that this is a new, groundbreaking and difficult area in which many other countries have not even got as far as we have, but we should not be in any way satisfied with where we are now. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) said earlier that we need to approach this Bill in a spirit of being humble, and this is an area in which humility is absolutely essential. I hope all of us realise how much further we have to go, and I hope the Minister will say how he proposes to address these important and so far uncovered issues in due course.

Liz Twist Portrait Liz Twist (Blaydon) (Lab)
- View Speech - Hansard - - - Excerpts

I wish to address new clauses 16 and 28 to 30, and perhaps make a few passing comments on some others along the way. Many others who, like me, were in the Chamber for the start of the debate will I suspect feel like a broken record, because we keep revisiting the same issues and raising the same points again and again, and I am going to do exactly that.

First, I will speak about new clause 16, which would create a new offence of encouraging or assisting serious self-harm. I am going to do so because I am the chair of the all-party parliamentary group on suicide and self-harm prevention, and we have done a good deal of work looking at the issue of self-harm and young people in the last two years. We know that suicide is the leading cause of death in men under the age of 50 and in women under the age of 35, with the latest available figures confirming that 5,583 people in England and Wales tragically took their own lives in 2021. We know that self-harm is a strong risk factor for future suicidal ideation, so it is really important that we tackle this issue.

The internet can be an invaluable and very supportive place for some people, who are given the opportunity to access support, but for others it can be harmful. What they see may include content that encourages, maintains or exacerbates self-harm and suicidal behaviours. Detailed information about methods can also increase the likelihood of imitative and copycat suicide, with risks such as contagion effects also present in the online environment.

19:09
Richard Burgon Portrait Richard Burgon (Leeds East) (Lab)
- Hansard - - - Excerpts

I pay tribute to my hon. Friend for the work she has done. She will be aware of the case of my constituent Joe Nihill, who at the age of 23 took his own life after accessing suicide-related material on the internet. Of course, we fully support new clause 16 and amendment 159. A lot of content about suicide is harmful but not illegal, so does my hon. Friend agree that what we really need is assurances from the Minister that, when this Bill comes back, it will include protections to ensure that adults such as Joe, who was 23, and adults accessing these materials through smaller platforms get the protection they really need?

Liz Twist Portrait Liz Twist
- Hansard - - - Excerpts

I thank my hon. Friend for those comments, and I most definitely agree with him. One of the points we should not lose sight of is that his constituent was 23 years of age—not a child, but still liable to be influenced by the material on the internet. That is one of the points we need to take forward.

It is really important that we look at the new self-harm offence to make sure that this issue is addressed. That is something that the Samaritans, which I work with, has been campaigning for. The Government have said they will create a new offence, which we will discuss at a future date, but there is real concern that we need to address this issue as soon as possible through new clause 16. I ask the Minister to comment on that so that we can deal with the issue of self-harm straightaway.

I now want to talk about internet and media literacy in relation to new clauses 29 and 30. YoungMinds, which works with young people, is supported by the Royal College of Psychiatrists, the British Psychological Society and the Mental Health Foundation in its proposals to promote the public’s media literacy for both regulated user-to-user services and search services, and to create a strategy to do this. Young people, when asked by YoungMinds what they thought, said they wanted the Online Safety Bill to include a requirement for such initiatives. YoungMinds also found that young people were frustrated by very broad, generalised and outdated messages, and that they wanted much more nuanced information—not generalised fearmongering, but practical ways in which they can address the issue. I do hope that the Government will take that on board, because if people are to be protected, it is important that we have a more sophisticated approach to media literacy than is reflected in the broad messages we sometimes get at present.

On new clause 28, I do believe there is a need for Government-supported advocacy services for young people—not to take responsibility away from them, but to assist and protect them. I want to make two other points. I see that the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) has left the Chamber again, but he raised an interesting and important point about the size of platforms covered by the Bill. I believe the Bill needs to cover those smaller or specialised platforms that people might have been pushed on to by changes to the larger platforms. I hope the Government will address that important issue in future, together with the issue of age, so that protection does not stop just with children, and we ensure that others who may have vulnerabilities are also protected.

I will not talk about “legal but harmful” because that is not for today, but there is a lot of concern about those provisions, which we thought were sorted out and agreed on, suddenly being changed. There is a lot of trepidation about what might come in future, and the Minister must understand that we will be looking closely at any proposed changes.

We have been talking about this issue for many years—indeed, since I first came to the House—and during the debate I saw several former Ministers and Secretaries of State with whom I have raised these issues. It is about time that we passed the Bill. People out there, including young people, are concerned and affected by these issues. The internet and social media are not going to stop because we want to make the Bill perfect. We must ensure that we have something in place. The legislation might be capable of revision in future, but we need it now for the sake of our young people and other vulnerable people who are accessing online information.

Suzanne Webb Portrait Suzanne Webb
- View Speech - Hansard - - - Excerpts

This is the first time I have been able to speak in the Chamber for some time, due to a certain role I had that prevented me from speaking in here. It is an absolute honour and privilege, on my first outing in some time, to have the opportunity to speak specifically to new clause 53, which is Zach’s law. I am delighted and thrilled that the Government are supporting Zach’s law. I have supported it for more than two years, together with my hon. Friend the Member for Watford (Dean Russell). We heard during the Joint Committee on the Draft Online Safety Bill how those who suffer from epilepsy were sent flashing images on social media by vile trolls. Zach Eagling, whom the law is named after, also has cerebral palsy, and he was one of those people. He was sent flashing images after he took part in a charity walk around his garden. He was only nine years of age.

Zach is inspirational. He is selflessly making a massive difference, and the new clause is world-leading. It is down to Zach, his mum, the UK Epilepsy Society, and of course the Government, that I am able to stand here to talk about new clause 53. I believe that the UK Epilepsy Society is the only charity in the world to change the law on any policy area, and that is new clause 53, which is pretty ground-breaking. I say thank you to Zach and the Epilepsy Society, who ensured that I and my hon. Friend the Member for Watford stepped up and played our part in that.

Being on the Joint Committee on the Draft Online Safety Bill was an absolute privilege, with the excellent chairmanship of my hon. Friend the Member for Folkestone and Hythe (Damian Collins). People have been talking about the Bill’s accompanying Committee, which is an incredibly good thing. In the Joint Committee we talked about this: we should follow the Bill through all its stages, and also once it is on the statute books, to ensure that it keeps up with those tech companies. The Joint Committee was put together with a focus on bringing together the right mix of skills. I am a technological luddite, but I brought my skills and understanding of audit and governance. My hon. Friend the Member for Watford brought technology and all his experience from his previous day job. As a result, we had a better Bill because of that mix of experience and the sharing of our expertise.

This Bill is truly world leading. New clause 53 is one small part of that, but it will make a huge difference to thousands of lives including, I believe, 600,000 who suffer from epilepsy. The simple reality is that the big tech companies can do better and need to step up. I have always said that we do not actually need the Bill or these amendments; we need the tech companies to do what they are supposed to do, and go out and regulate their consumer product. I have always strongly believed that.

During my time on the Committee I learned that we must follow the money—that is what it is all about for the tech companies. We have been listening to horrific stories from grieving parents, some of whom I met briefly, and from those who suffered at the hands of racism, abuse, threats—the list is endless. The tech companies could stop that now. They do not need the Bill to do it and they should do the right thing. We should not have to get the Bill on to the statute books to enforce what those companies should be doing in the first place. We keep saying that this issue has been going on for five years. The tech companies know that this has been talked about for five years, so why are they not doing something? For me, the Bill is for all those grieving families who have lost their beautiful children, those who have been at the mercy of keyboard warriors, and those who have suffered harm or lost their lives because the tech companies have not, but could have, done better. This is about accountability. Where are the tech companies?

I wish to touch briefly on bereaved parents whose children have been at the mercy of technology and content. Many families have spent years and years still unable to understand their child’s death. We must consider imposing transparency on the tech companies. Those families cannot get their children back, but they are working hard to ensure that others do not lose theirs. Data should be given to coroners in the event of the death of a child so that the circumstances can be understood. This is important to ensure that there is a swift and humane process for the coroner to access information where there is reason to suspect that online content has played a part in a child’s death.

In conclusion, a huge hurrah that we have new clause 53, and I thank the Government for this ground-breaking Bill. An even bigger hurrah to Zach, Zach’s mum, and the brilliant Epilepsy Society, and, of course, to Zach’s law, which is new clause 53.

Jamie Stone Portrait Jamie Stone
- View Speech - Hansard - - - Excerpts

Clearly I am on my feet now because I am the Liberal Democrat DCMS spokesman, but many is the time when, in this place, I have probably erred on the side of painting a rosy picture of my part of the world—the Highlands—where children can play among the heather and enjoy themselves, and life is safe and easy. This week just gone, I was pulled up short by two mothers I know who knew all about today. They asked whether I would be speaking. They told me of their deep concern for a youngster who is being bullied right now, to the point where she was overheard saying among her family that she doubted she would ever make the age of 21. I hope to God that that young person, whom I cannot name, is reached out to before we reach the tragic level of what we have heard about already today. Something like that doesn’t half put a shadow in front of the sun, and a cold hand on one’s heart. That is why we are here today: we are all singing off the same sheet.

The Liberal Democrats back new clause 17 in the name of the right hon. Member for Barking (Dame Margaret Hodge). Fundamental to being British is a sense of fair play, and a notion that the boss or bosses should carry the can at the end of the day. It should not be beyond the wit of man to do exactly what the right hon. Lady suggested, and nobble those who ultimately hold responsibility for some of this. We are pretty strong on that point.

Having said all that, there is good stuff in the Bill. Obviously, it has been held up by the Government—or Governments, plural—which is regrettable, but it is easy to be clever after the fact. There is much in the Bill, and hopefully the delay is behind us. It has been chaotic, but we are pleased with the direction in which we are heading at the moment.

I have three or four specific points. My party welcomes the move to expand existing offences on sharing intimate images of someone to include those that are created digitally, known as deepfakes. We also warmly welcome the move to create a new criminal offence of assisting or encouraging self-harm online, although I ask the Government for more detail on that as soon as possible. Thirdly, as others have mentioned, the proposed implementation of Zach’s law will make it illegal to send flashing images intended to harm people with epilepsy.

If the pandemic taught me one thing, it was that “media-savvy” is not me. Without my young staff who helped me during that period, it would have been completely beyond my capability to Zoom three times in one week. Not everyone out there has the assistance of able young people, which I had, and I am very grateful for that. One point that I have made before is that we would like to see specific objectives—perhaps delivered by Ofcom as a specific duty—on getting more media savvy out there. I extol to the House the virtue of new clause 37, tabled by my hon. Friend the Member for Twickenham (Munira Wilson). The more online savvy we can get through training, the better.

At the end of the day, the Bill is well intentioned and, as we have heard, it is essential that it makes a real impact. In the case of the young person I mentioned who is in a dark place right now, we must get it going pretty dashed quick.

19:30
Natalie Elphicke Portrait Mrs Elphicke
- View Speech - Hansard - - - Excerpts

I rise to speak to new clause 55, which stands in my name. I am grateful to my many right hon. and hon. Friends who have supported it, both by putting their name to it and otherwise. I welcome the Minister and his engagement with the new clause and hope to hear from him further as we move through the debate.

The new clause seeks to create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration. Members may have wondered how so many people—more than 44,000 this year alone—know whom to contact to cross the Channel, how to go about it and how much it will cost. Like any business, people smuggling relies on word of mouth, a shopfront or digital location on the internet, and advertising. As I will set out, in this context advertising is done not through an advert in the local paper but by posting videos and photos online.

People who use the Channel crossing routes come from an astonishing array of countries—from Eritrea and Vietnam to Iraq and Iran—but they all end up arriving on boats that leave from France. Since May 2022, there has been a massive increase in the number of Albanians crossing the Channel in small boats. From May to September this year, Albanian nationals comprised 42% of small boat crossings, with more than 11,000 Albanians arriving by small boat, compared with 815 in the whole of the previous year. It is little wonder that it is easy to find criminal gangs posting in Albanian on TikTok, with videos showing cheery migrants with thumbs up scooting across the Channel on dinghies and motoring into Britain with ease. Those videos have comments, which have been roughly translated as:

“At 8 o’clock the next departure, hurry to catch the road”;

“They passed again today! Get in touch today”;

“Get on the road today, serious escape within a day, not…a month in the forest like most”;

“The trips continue, contact us, we are the best and the fastest”;

and

“Every month, safe passage, hurry up”.

However, far from being safe, the small boat crossings are harmful, dangerous and connected with serious crime here in the UK, including modern slavery, the drugs trade and people trafficking.

With regard to the journey, there have been a number of deaths at sea. The Minister for Immigration recently stated that many people in processing centres

“present with severe burns that they have received through the combination of salty water and diesel fuel in the dinghies.”—[Official Report, 28 November 2022; Vol. 723, c. 683.]

That, of course, underlines why prevention, detection and interception of illegal entry is so important on our sea border. It also speaks to the harm that my new clause seeks to prevent: it would identify and disrupt the ability of those gangs to post on social media, to put up photographs that attract new business, and to communicate in relation to their illegal activity.

The National Crime Agency has identified links with the criminal drugs trade, modern slavery and other serious and violent crime. That is because illegal immigration and modern slavery offences do not just happen abroad. A criminal enterprise of this scale has a number of operators both here in the UK and abroad. That includes people here in the UK who pay for the transit of another. When they do, they do not generally have the welfare of that other individual in mind. There are particular concerns about young people and unaccompanied children, as well as about people who find themselves in debt bondage and modern slavery.

That also includes people here in the UK who provide information, such as those TikTok videos, to a friend or contacts in a home country so that other people can make their own arrangements to travel. It includes people here in the UK who take photos of arrivals and post or message them to trigger success fees. Those fees, paid on photographic evidence of arrival, are how this illegal enterprise transacts its business, and they are thought to be responsible for some of the most terrifying experiences of people making the crossing, including even a pregnant woman and others being forced into boats at gunpoint and knifepoint in poor weather when they did not want to go, and parents separated from their children at the water’s edge, with their children taken and threatened to coerce the parents into complying.

Last year, 27 people died in the Channel in a single day, in the worst small boat incident to date. A newspaper report about those deaths included comments about a young man named Pirot, who died. His friend said of the arrangements for the journey:

“Typically…the smugglers made deals with families at home. Sometimes they turned up at the camp in masks. The crossing costs about £3,000 per person, with cash demanded in full once their loved one had made it to Dover. One of the Iraqi Kurdish smugglers who arranged Pirot’s crossing has since deleted his Facebook page and WhatsApp account”.

TikTok, WhatsApp and Facebook have all been identified as platforms actively used by the people smugglers. Action is needed within the Bill’s remit to protect people from people smugglers and to save lives in the Channel. The new offence would ensure that people here in the UK who promote illegal immigration and modern slavery face a stronger deterrent and, for the first time, real criminal penalties for their misdeeds. It would make it harder for the people smugglers to sell their wares. It would help to protect people who would be exploited and put at risk by those criminal gangs. The risk to life and of injury, the risk of modern slavery, and the risk of being swept into further crime, both abroad and here in the UK, are very real.

The new offence would be another tool in the toolbox to tackle illegal immigration and prevent modern slavery. I hope that when the Minister makes his remarks, he may consider further expansion of other provisions currently in the Bill but outside the scope of our discussions, such as the schedule 7 priority offences. New clause 55 would tackle the TikTok traffickers and help to prevent people from risking their lives by taking these journeys across the English Channel.

Carla Lockhart Portrait Carla Lockhart (Upper Bann) (DUP)
- View Speech - Hansard - - - Excerpts

I welcome the fact that we are here today to discuss the Bill. It has been a long haul, and we were often dubious as to whether we would see it progressing. The Government have done the right thing by progressing it, because ultimately, as each day passes, harm is being caused by the lack of regulation and enforcement. While some concerns have been addressed, many have not. To that end, this must be not the end but the beginning of a legislative framework that is fit for purpose; one that is agile and keeps up with the speed at which technology changes. For me, probably the biggest challenge for the House and the Government is not how we start but how we end on these issues.

Like many Members, I am quite conflicted when it comes to legal but harmful content. I know that is a debate for another day, but I will make one short point. I am aware of the concerns about free speech. As someone of faith, I am cognisant of the outrageous recent statement from the Crown Prosecution Service that it is “no longer appropriate” to quote certain parts of the Bible in public. I would have serious concerns about similar diktats and censorship being imposed by social media platforms on what are perfectly legitimate texts, and on beliefs based on those texts. Of course, that is just one example, but it is a good example of why, given the ongoing war that some wage on certain beliefs and opinions, it would be unwise to bestow such policing powers on social media outlets.

When the Bill was first introduced, I made it very clear that it needed to be robust in its protection of children. In the time remaining, I wish to address some of the amendments that would strengthen the Bill in that regard, as well as the enforcement provisions.

New clause 16 is a very important amendment. None of us would wish to endure the pain of a child or loved one self-harming. Sadly, we have all been moved by the very personal accounts from victims’ families of the pain inflicted by self-harm. We cannot fathom what is in the mind of those who place such content on the internet. The right hon. Member for Haltemprice and Howden (Mr Davis) and those co-signing the new clause have produced a very considered and comprehensive text, dealing with all the issues in terms of intent, degree of harm and so on, so I fully endorse and welcome new clause 16.

Likewise, new clauses 45 and 46 would further strengthen the legislation by protecting children from the sharing of an intimate image without consent. Unfortunately, I have sat face to face—as I am sure many in this House have—with those who have been impacted by such cruel use of social media. The pain and humiliation it imposes on the victim is significant. It can cause scars that last a lifetime. While the content can be removed, the impact cannot be removed from the mind of the victim.

Finally, I make mention of new clause 53. Over recent months I have engaged with campaigners who champion the rights and welfare of those with epilepsy. Those with this condition need to be safe on the internet from those who, with very specific and callous motivation, target them because of their condition. We make this change knowing that such legislative protection will make them safer online. Special mention must once again go to young Zach, who has been the star in making this change. What an amazing campaign, one that says to society that no matter how young or old you are, you can bring about change in this House.

This is a milestone Bill. I believe it brings great progress in offering protections from online harm. I believe it can be further strengthened in areas such as pornography. We only have to consider the British Board of Film Classification’s finding that children as young as seven are coming across pornography online, with 51% of 11 to 13-year-olds having seen pornography at some point. That is damaging people’s mental health and their perception of what a healthy relationship should look and feel like. Ultimately, the Bill does not go far enough on that issue. It will be interesting to see how the other place deals with the Bill and makes changes to it. The days of the internet being the wild west, lawless for young and old, must end. I commend the Bill to the House.

Vicky Ford Portrait Vicky Ford
- View Speech - Hansard - - - Excerpts

It is great that the Bill is back in this Chamber. I have worked on it for many years, as have many others, during my time on the Science and Technology Committee and the Women and Equalities Committee, and as Children’s Minister. I just want to make three points.

First, I want to put on the record my support for the amendments tabled by my right hon. Friend the Member for Basingstoke (Dame Maria Miller). She is a true, right and honourable friend of women and girls all across the country. It is vital that women and girls are protected from intimate image abuse, from perverse and extreme pornography, and from controlling and coercive behaviour, and that we create a new offence to criminalise cyber-flashing.

Secondly, I want to talk about new clause 16 and self-harm, especially in relation to eating disorders. As I said in this place on Thursday, it is terrifying how many young people are suffering from anorexia today. The charity Beat estimates that 1.25 million people are suffering from eating disorders. A quarter of them are men; most are women. It also reminds us that anorexia is the deadliest of all mental illnesses.

It is very hard to talk about one’s own experiences of mental illness. It brings back all the horrors. It makes people judge you differently. And you fear that people will become prejudiced against you. I buried my own experiences for nearly 40 years, but when I did speak out, I was contacted by so many sufferers and families, thanking me for having done so and saying it had brought them hope.

19:44
There may be many reasons why we have an increase in eating disorders, and I am sure that lockdown and the fears of the pandemic are a part of it, but I do remember from my own experience of anorexia 40 years ago how I had got it into my head that only by being ultra-thin could I be beautiful or valued. That is why images that glamorise self-harm, images that glamorise eating disorders, are so damaging. So it is really concerning to hear in recent surveys that more than one in four children have seen content about anorexia online. It is great that Ministers have promised that all children will be protected from self-harm, including eating disorders. When it comes to adults, however, I understand that Ministers may be considering an amendment similar to new clause 16 that would make it illegal to encourage self-harm online, but that it might not cover eating disorders, because they are just considering giving adults the right to opt out of seeing such content.
I was lucky that by the time I turned 18 years old I was over the worst of my anorexia, but when I look back at my teenage self, had I been 18 at the peak of my illness and had access to social media, I do not think I would have opted out of that content; I think I might have sought it out. It is incredibly important that the definition of self-harm absolutely recognises that eating disorders are a form of self-harm and are a killer.
My third point is that I welcome the measures to protect children from sexual abuse online and join my voice with all those who have thanked the Internet Watch Foundation. I have been honoured to be a champion of the foundation for over a decade. The work it does is so important and so brave. The Everyone’s Invited movement exposed the epidemic of sexual violence being suffered by young women and girls in our schools. As Children’s Minister at the time, I listened to its campaigners and learned from them how online pornography normalises sexual violence. There must be measures to prevent children from accessing all online pornography. I was worried to hear from Barnardo’s recently that more needs to be done to address content that sexualises children in pornography. I hope the Minister will work closely with all children’s charities, and with the wonderful Children’s Commissioner, as the Bill goes through the rest of its stages.
Jim Shannon Portrait Jim Shannon
- View Speech - Hansard - - - Excerpts

It is a pleasure to speak in the debate. I thank Members who have spoken thus far for their comments. I commend the right hon. Member for Chelmsford (Vicky Ford) for her remarks on eating disorders. At this time, we are very aware of that pertinent issue: the impact that social media—the social pressure and the peer pressure—has on those who feel they are too fat or are carrying weight when they are not. That is part of what the Bill tries to address. I thank the Minister for his very constructive comments—he is always constructive—and for laying out where we are. Some of us perhaps have concerns that the Bill does not go far enough. I know I am one of them, and maybe, Minister, you might be of the same mind yourself—

Jim Shannon Portrait Jim Shannon
- Hansard - - - Excerpts

The Minister might be of the same mind himself.

Since I began speaking in these debates, my office has seen an increase in correspondence from parents who are thankful that these difficult issues are being talked about. The world is changing and progressing, and if we want to protect our children and our grandchildren—I have six grandchildren—and all the other grandchildren who use social media, the least we can do is make sure they are safe.

I commend the hon. Member for Batley and Spen (Kim Leadbeater) and others, including the hon. Member for Watford (Dean Russell), who have spoken about Zach’s law. We are all greatly impressed that we have that in the Bill through constructive lobbying. New clause 28, which the hon. Member for Rotherham (Sarah Champion) referred to, relates to advocacy for young people. That is an interesting idea, but I feel that advocacy should be for the parents first and not necessarily young people.

Ahead of the debate, I was in contact with the Royal College of Psychiatrists. It published a report entitled “Technology use and the mental health of children and young people”—new clause 16 is related to that—which provided an overview of research into screen time and social media use by children and young teenagers. It concluded that excessive use of phones and social media by a young person is detrimental to their development and mental health—as we all know and as Members have spoken about—and, furthermore, that online abuse and bullying have become more prevalent because of that. The right hon. Member for Witham (Priti Patel) referred to those who are susceptible to online harm. We meet them every day, and parents tell me that our concerns are real.

A recent report by NHS Digital found that one in eight 11 to 16-year-olds reported that they had been bullied online. When parents contact me, they say that bullying online is a key issue for them, and the statistics come from those who choose to be honest and talk about it. Although the Government’s role is to create a Bill that enables protection for our children, there is also an incredible role for schools, which can address bullying. My hon. Friend the Member for Upper Bann (Carla Lockhart) and I talked about some of the young people we know at school who have been bullied online. Schools have stepped in and stopped that, encouraging and protecting children, and they can play that role as well.

We have all read the story of Molly Russell, who was only 14 years old when she took her own life. Nobody in this House or outside it could fail to have been moved by her story. Her father stated that he strongly believed that the images, videos and information that she was able to access through Instagram played a crucial part in her life being cut short. The Bill must complete its passage and focus on strengthening protections online for children. Ultimately, the responsibility is on large social media companies to ensure that harmful information is removed, but the Bill puts the onus on us to hold social media firms to account and to ensure that they do so.

Harmful and dangerous content for children comes in many forms, including online abuse and exposure to self-harm and suicide-related images. In addition, any inappropriate or sexual content has the potential to put children and young people at severe risk. The Bill is set to put provisions in place to protect victims of the sharing of nude or intimate photos. That is increasingly important for young people, who are potentially being groomed online and do not understand the full extent of what they are doing and the risks that come with that. Amendments have been tabled to ensure that, should such cases of photo sharing go to court, provisions are in place to ensure complete anonymity for the victims, for example through video links in court.

I commend the right hon. Member for Basingstoke (Dame Maria Miller), who is not in her place, for her hard work in bringing forward new clause 48. Northern Ireland, along with England and Wales, will benefit from new clause 53, and I welcome the ability to hand down sentences of between six months and potentially five years.

Almost a quarter of girls who have taken a naked image have had their image sent to someone else online without their permission. Girls face very distinct and increased risks on social media, with more than four in five online grooming crimes targeting girls, and 97% of child abuse material featuring the sexual abuse of girls—wow, we really need to do something to protect our children and to give parents hope. There needs to be increased emphasis and focus on making children’s use of the internet safer by design. Once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of sexual abuse, which often stem from photo sharing.

The Minister referred to terrorism and how terrorism can be promoted online. I intervened on him to mention the glorification of IRA terrorism and how that encourages further acts of terrorism and draws in people who are susceptible to becoming involved. I am quite encouraged by the Minister’s response, and I think that we need to take a significant step. Some in Northern Ireland, for instance, try to rewrite history and use the glorification of terrorism for that purpose. We would like to see measures strengthened to ensure that those involved in such acts across Northern Ireland are controlled.

In conclusion, there are many aspects of the Bill that I can speak in support of in relation to the benefits of securing digital protections for those on social media. This is, of course, about protecting not just children, but all of us from the dangers of social media. I have chosen to speak on these issues as they are often raised by constituents. There are serious matters regarding the glorification and encouragement of self-harm that the Bill needs to address. We have heard stories tonight that are difficult to listen to, because they are true stories from people we know, and we have heard horror stories about intimate photo sharing online. I hope that action on those issues, along with the many others that the Government are addressing, will be embedded in the Bill with the intent to finally ensure that we have regulations and protection for all people, especially our children—I think of my children and grandchildren, and like everybody else, my constituents.

Dean Russell Portrait Dean Russell
- View Speech - Hansard - - - Excerpts

I welcome the Minister to his place; I know that he will be excellent in this role, and it is incredible that he is so across the detail in such a short time.

I will primarily talk about new clause 53—that may not be that surprising, given how often it has been spoken about today—which is, ultimately, about Zach’s law. Zach is a truly heroic figure, as has been said. He is a young child with cerebral palsy, autism and epilepsy who was cruelly trolled by sick individuals who sent flashing images purposely to cause seizures and cause him damage. That was not unique to Zach, sadly; it happened to many people across the internet and social media. When somebody announced that they were looking for support, having been diagnosed with epilepsy, others would purposely identify that and target the person with flashing images to trigger seizures. That is absolutely despicable.

My hon. Friend the Member for Stourbridge (Suzanne Webb) has been my partner in crime—or in stopping the crime—over the past two years, and this has been a passion for us. Somebody said to me recently that we should perhaps do our victory lap in the Chamber today for the work that has been done to change the law, but Zach is the person who will get to go around and do that, as he did when he raised funds after he was first cruelly trolled.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) also deserves an awful lot of praise. My hon. Friend the Member for Stourbridge and I worked with him on the Joint Committee on the draft Online Safety Bill this time last year. It was incredible to work with Members of both Houses to look at how we can make the Bill better. I am pleased about the response to so many measures that we put forward, including the fact that we felt that the phrase “legal but harmful” created too many grey areas that would not catch the people committing these awful acts online—what I often consider to be crimes—to cause harm.

I want to highlight some of what has been done over the past two years to get Zach’s law to this point. If I ever write a memoir, I am sure that my diaries will not be as controversial as some in the bookshops today, but I would like to dedicate a chapter to Zach’s law, because it has shown the power of one individual, Zach, to change things through the democratic process in this House, to change the law for the entire country and to protect people who are vulnerable.

Not only was Zach’s case raised in the Joint Committee’s discussions, but afterwards my hon. Friend the Member for Stourbridge and I managed to get all the tech companies together on Zoom—most people will probably not be aware of this—to look at making technical changes to stop flashing images being sent to people. There were lots of warm words: lots of effort was supposedly put in so that we would not need a law to stop flashing images. We had Giphy, Facebook, Google, Twitter—all these billion-pound platforms that can do anything they want, yet they could not stop flashing images being sent to vulnerable people. I am sorry, but that is not the work of people who really want to make a difference. That is people who want to put profit over pain—people who want to ensure that they look after themselves before they look after the most vulnerable.

20:00
That is why the Bill is so important: because if the platforms will not do the right thing, we will. Hon. Members may disagree on some of the detail in the Bill, but most of that detail is there to stop the platforms doing the wrong thing. We should not have to force them into it, but we have come to the point where we will. I am sure that the measures in the Bill will go further than they would ever have wanted.
To repeat a phrase that I have used before in this Chamber, Andy Warhol used to talk about the era of 15 minutes of fame, but sadly through social media we now have 15 minutes of shame. People are hate-mobbed because they have a different point of view, or have images shared that they do not want shared, purely to cause them distress. The Bill will help to stop most of that.
As my hon. Friend the Member for Stourbridge says, a key issue is chasing the money. The truth is that a lot of online content is addictive, especially to young kids who scroll throughout the night, watching the next TikTok or reading the next message or the next post. They are trying to see the next piece of content that will give them some enjoyment or connect them to the real world. The platforms have put the “ad” into “addiction” and have caused harm by doing so, making profits they should not have made from the harm that they have done to children.
Ultimately, this debate is about making sure that the Bill is fit for purpose. I totally understand that many hon. Members across the Chamber want lots of changes and additions to it, but as we are coming up to Christmas, perhaps I can use a suitable analogy. We do not want a Christmas tree Bill with so many baubles of new legislation hanging from it that we do not achieve our ultimate goal, which is to protect.
Suzanne Webb Portrait Suzanne Webb
- Hansard - - - Excerpts

Talking of Christmas, would not the best Christmas present for lovely Zach be to enshrine new clause 53, that amazing amendment, as Zach’s law? Somehow we should formalise it as Zach’s law—that would be a brilliant Christmas present.

Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

I wholeheartedly agree. Zach, if you are listening right now, you are an absolute hero—you have changed so much for so many people. Without your effort, this would not be happening today. In future, we can look back on this and say, “You know what? Democracy does work.”

I thank all hon. Members for their campaigning work to raise Zach’s law in the public consciousness. It even reached the US. I am sure many hon. Members dance along to Beyoncé of an evening or listen to her in the car when they are bopping home; a few months ago she changed one of her YouTube videos, which had flashing images in it, because the Epilepsy Society reached out to describe the dangers that it would cause. These campaigns work. They are about public awareness and about changing the law. We talk about the 15 minutes of shame that people face on social media, but ultimately the shame is on the platforms for forcing us to legislate to make them do the right thing.

I will end with one small point. The internet has evolved; the world wide web has evolved; social media is evolving; the metaverse, 3D virtual reality worlds and augmented reality are changing. I urge the Government or the House to look at creating a Committee specifically on the Bill. I know that there are lots of arguments that it should be a Sub-Committee of the Digital, Culture, Media and Sport Committee, but the truth is that the online world is changing dramatically. We cannot take snapshots every six months, every year or every two years and assume that they will pick up on all the changes happening in the world.

As the hon. Member for Pontypridd (Alex Davies-Jones) said, TikTok did not even exist when the Bill was first discussed. We now have an opportunity to ask what is coming next, keep pace with it and put ethics and morality at the heart of the Bill to ensure that it is fit for purpose for many decades to come. I thank the Minister for his fantastic work; my partner in crime, my hon. Friend the Member for Stourbridge, for her incredible work; and all Members across the House. Please, please, let us get this through tonight.

Laura Farris Portrait Laura Farris (Newbury) (Con)
- View Speech - Hansard - - - Excerpts

It is a privilege to follow my hon. Friend the Member for Watford (Dean Russell) and so many hon. Members who have made thoughtful contributions. I will confine my comments to the intersection of new clauses 28 and 45 to 50 with the impact of online pornography on children in this country.

There has been no other time in the history of humanity when we have exposed children to the violent, abusive, sexually explicit material that they currently encounter online. In 2008, only 14% of children under 13 had seen pornography; three years later, that figure had risen to 49%, correlating with the rise in children owning smartphones. Online pornography has a uniquely pernicious impact on children. For very young children, there is an impact just from seeing the content. For older teenagers, there is an impact on their behaviour.

We are seeing more and more evidence of boys exhibiting sexually aggressive behaviour, with actions such as strangulation, which we have dealt with separately in this House, and misogynistic attitudes. Young girls are being conditioned into thinking that their value depends on being submissive or objectified. That is leading children down a pathway towards serious sexual offending by children against children. Overwhelmingly, the victims are young girls.

Hon. Members need not take my word for it: after Everyone’s Invited began documenting the nature and extent of the sexual experiences happening in our schools, an Ofsted review revealed that the most prevalent victims of serious sexual assaults among the under-25s are girls aged 15 to 17. In a recent publication in anticipation of the Bill, the Children’s Commissioner cited the example of a teenage boy arrested for his part in the gang rape of a 14-year-old girl. In his witness statement to the police, the boy said that it felt just like a porn film.

Dr John Foubert, the former White House adviser on rape prevention, has said:

“It wasn’t until 10 years ago when I came to the realization that the secret ingredient in the recipe for rape was not secret at all…That ingredient…is today’s high speed Internet pornography.”

The same view has been expressed, in one form or another, by the chief medical officers for England and for Wales, the Independent Inquiry into Child Sexual Abuse, the Government Equalities Office, the Children’s Commissioner, Ofsted and successive Ministers.

New clause 28 requests an advocacy body to represent and protect the interests of child users. I welcome the principle behind the new clause. I anticipate that the Minister will say that he is already halfway there by making the Children’s Commissioner a statutory consultee to Ofcom, along with the Domestic Abuse Commissioner and others who have been named in this debate. However, whatever the Government make of the Opposition’s new clause, they must surely agree that it alights on one important point: the online terrain in respect of child protection is evolving very fast.

By the time the Bill reaches the statute book, new providers will have popped up again. With them will come unforeseen problems. When the Bill was first introduced, TikTok did not exist, as my hon. Friend the Member for Watford said a moment ago, and neither did OnlyFans. That is precisely the kind of user-generated site that is likely to try and dodge its obligations to keep children safe from harm, partly because it probably does not even accept that it exposes them to harm: it relies on the fallacy that the user is in control, and operates an exploitative business model predicated on that false premise.

I think it important for someone to represent child protection on a regular basis because of the issue of age verification, which we have canvassed, quite lightly, during the debate. Members on both sides of the House have pointed out that the current system, which allows children to self-certify their date of birth, is hopelessly out of date. I know that Ministers envisage something much more ambitious with the Bill’s age assurance and age verification requirements, including facial recognition technology, but I think it is worth our having a constant voice reporting on the adequacy of whatever age assurance steps internet providers may take, because we know how skilful children can be in navigating the internet. We know that there are those who have the technological skills to shroud their IP address or to use a VPN. I also think it important for there to be a voice to maintain the pressure on the Government—which is what I myself want to do tonight—for an official Government inquiry into pornography harms, akin to the one on gambling harms that was undertaken in 2019. That inquiry was extremely important in identifying all the harm that was caused by gambling. The conclusions of an equivalent inquiry into pornography would leave no wriggle room for user-generated services to deny the risk of harm.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) pointed out, very sensibly, that her new clauses 45 to 50 build on all the Law Commission’s recommendations. They dovetail with so much work that has already been done in the House. We have produced, for instance, the Domestic Abuse Act 2021, which dealt with revenge porn, whether threatened or actual and whether genuine or fake, and with coercive control. Many Members recognise what was achieved by all our work a couple of years ago. However, given the indication from Ministers that they are minded to accept the new clauses in one form or another, I should like them to explain to the House how they think the Bill will capture the issue of sexting, if, indeed, it will capture that issue at all.

As the Minister will know, sexting means the exchanging of intimate images by, typically, children, sometimes on a nominally consensual basis. Everything I have read about it seems to say, “Yes, prima facie this is an unlawful act, but no, we do not seek to criminalise children, because we recognise that they make errors of judgment.” However, while I agree that it may be proportionate not to criminalise children for doing this, it remains the case that when an image is sent with the nominal consent of the child—it is nearly always a girl—it is often a product of duress, the image is often circulated far beyond the original recipient, and that often has devastating personal consequences for the young girl involved. All the main internet providers now have technology that can identify a nude image. It would be possible to require them to prevent nude images from being shared when, because of extended age-verification abilities, they know that the user is a child. If the Government are indeed minded to accept new clauses 45 to 50, I should like them to address that specific issue of sexting rather than letting it fall by the wayside as something separate, or outside the ambit of the Bill.

Nigel Evans Portrait Mr Deputy Speaker (Mr Nigel Evans)
- Hansard - - - Excerpts

The last Back-Bench speaker is Miriam Cates.

Miriam Cates Portrait Miriam Cates (Penistone and Stocksbridge) (Con)
- View Speech - Hansard - - - Excerpts

Thank you, Mr Deputy Speaker. I think you are the third person to take the Chair during the debate. It is an honour to follow my hon. Friend the Member for Newbury (Laura Farris); I agree with everything that she said, and my comments will be similar.

This has been a long but fascinating debate. We have discussed only a small part of the Bill today, and just a few amendments, but the wide range of the debate reflects the enormous complexity of what the Bill is intended to do, which is to regulate the online world so that it is subject to rules, regulations, obligations and protective measures equivalent to those in the offline world. We must do this, because the internet is now an essential part of our infrastructure. I think that we see the costs of our high-speed broadband as being in the same category as our energy and water costs, because we could not live without it. Like all essential infrastructure, the internet must be regulated. We must ensure that providers are working in the best interests of consumers, within the law and with democratic accountability.

Regulating the internet through the Bill is not a one-off project. As many Members have said, it will take years to get it right, but we must begin now. I think the process can be compared with the regulation of roads. A century ago there were hardly any private motor cars on the roads. There were no rules; people did not even have to drive on a particular side of the road. There have been more than 100 years of frequent changes to rules and regulations to get it right. It seems crazy now to think there was a time when there were no speed limits and no seat belts. The death rates on the roads, even in the 1940s, were 13 times higher than they are now. Over time, however, with regulation, we have more or less solved the complex problems of road regulation. Similarly, it will take time to get this Bill right, but we must get it on to the statute book and give it time to evolve.

20:15
The crucial point, though, is that we must look at the internet through a child’s eyes. I thoroughly support the sentiment embodied in new clause 28, which, as my hon. Friend said, calls for the establishment of an advocacy body to represent child users of the internet. The internet has many impacts on adults. Some are good—I love Google Maps; I will never get lost again—and some are bad, but the internet has utterly transformed childhood. Some would say that it has destroyed childhood. Childhood is a crucial and irreplaceable time, and before the internet parents, schools and communities had full control over who influenced their children. People did not let others into their home unless they trusted them, and knew that they had the best interests and the welfare of their children at heart. Now, the number of people who are influencing our children in their bedrooms, often malevolently, is off the scale. It is hard to comprehend the impact and the influence that the internet has had on children, and a large number of those providers do not have their best interests at heart.
We have heard a great many tragic stories today about children who have been harmed through other people’s direct access to their lives over mobile phones, but, as my hon. Friend said, one of the overriding results of the internet is the sexualisation of children in a truly destructive way. As my hon. Friend also said, about 50% of 12-year-olds have now seen online pornography, and 1.4 million UK children access porn every month. There is nothing mainstream about this pornography. It is not the same as the dodgy magazines of old. Violence, degrading behaviour, abuse and addiction are all mainstream on pornography sites now.
Dean Russell Portrait Dean Russell
- Hansard - - - Excerpts

Does my hon. Friend agree that the work of charities such as Dignify in Watford, where Helen Roberts does incredible work in raising awareness of this issue, is essential to ensuring that people are aware of the harm that can be done?

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I completely agree. Other charities, such as CEASE—the Centre to End All Sexual Exploitation—and Barnardo’s have been mentioned in the debate, and I think it so important to raise awareness. There are many harms on the internet, but pornography is an epidemic. It makes up a third of the material on the internet, and its impact on children cannot be overstated. Many boys who watch porn say that it gives them ideas about the kind of sex that they want to try. It is not surprising that a third of child sexual abuse is committed by other children. During puberty—that very important period of development—boys in particular are subject to an erotic imprint. The kind of sex that they see and the sexual ideas that they have during that time determine what they see as normal behaviour for the rest of their lives. It is crucial for children to be protected from harmful pornography that encourages the objectification and abuse of—almost always—women.

Neale Hanvey Portrait Neale Hanvey
- Hansard - - - Excerpts

I thank—in this context—my hon. Friend for giving way.

The lawsuits are coming. There can certainly be no more harmful act than encouraging a young person to mutilate their body with so-called gender-affirming surgery with no therapeutic intervention beforehand. In Scotland, the United Nations special rapporteur for violence against women and girls has criticised the Scottish Government’s Gender Recognition Reform (Scotland) Bill. Does the hon. Lady agree that it is time to establish who is a feminist, and who is a fake to their fingertips?

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I thank the hon. Gentleman for his intervention. He is absolutely right: inciting a child to harm their body, whatever that harm is, should be criminalised, and I support the sentiment of new clause 16, which seeks to do that. Sadly, lots of children, particularly girls, go online and type in “I don’t like my body”. Maybe they are drawn to eating disorder sites, as my right hon. Friend the Member for Chelmsford (Vicky Ford) has mentioned, but often they are drawn into sites that glorify transition, often with adult men in other countries, whom they do not even know, posting pictures of double mastectomies performed on teenage girls.

John Nicolson Portrait John Nicolson (Ochil and South Perthshire) (SNP)
- Hansard - - - Excerpts

The hon. Lady must realise that this is fantasy land. It is incredibly difficult to get gender reassignment surgery. The “they’re just confused” stuff is exactly what was said to me as a young gay man. She must realise that this really simplifies a complicated issue and patronises people going through difficult choices.

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I really wish it was fantasy land, but I am in contact with parents each and every day who tell me stories of their children being drawn into this. Yes, in this country it is thankfully very difficult to get a double mastectomy when you are under 18, but it is incredibly easy to buy testosterone illegally online and to inject it, egged on by adults in other countries. Once a girl has injected testosterone during puberty, she will have a deep voice, facial hair and male-pattern baldness for life, and she will be infertile. That is a permanent change, it is self-harm and it should be criminalised under this Bill, whether through this clause or through the Government’s new plans. The hon. Member for Kirkcaldy and Cowdenbeath (Neale Hanvey) is absolutely right: this is happening every day and it should be classed as self-harm.

Going back to my comments about the effect on children of viewing pornography, I absolutely support the idea of putting children’s experience at the heart of the Bill but it needs to be about children’s welfare and not about what children want. One impact of the internet has been to blur the boundary between adults and children. As adults, we need to be able to say, “This is the evidence of what is harmful to children, and this is what children should not be seeing.” Of course children will say that they want free access to all content, just like they want unlimited sweets and unlimited chocolate, but as adults we need to be able to say what is harmful for children and to protect them from seeing it.

This brings me to Government new clause 11, which deals with making sure that child sexual abuse material is taken offline. There is a clear link between the epidemic of pornography and the epidemic of child sexual abuse material. The way the algorithms on porn sites work is to draw users deeper and deeper into more and more extreme content—other Members have mentioned this in relation to other areas of the internet—so someone might go on to what they think is a mainstream pornography site and be drawn into more and more explicit, extreme and violent criminal pornography. At the end of this, normal people are drawn into watching children being abused, often in real time and often in other countries. There is a clear link between the epidemic of porn and the child sexual abuse material that is so prevalent online.

Last week in the Home Affairs Committee we heard from Professor Alexis Jay, who led the independent inquiry into child sexual abuse. Her report, written over seven years, is harrowing. Sadly, its conclusion is that seven years later, there are now even more opportunities for people to abuse children because of the internet, so making sure that providers have a duty to remove any child sexual abuse material that they find is crucial. Many Members have referred to the Internet Watch Foundation. One incredibly terrifying statistic is that in 2021, the IWF removed 252,194 web pages containing child sexual abuse material and an unknown number of images. New clause 11 is really important, because it would put the onus on the tech platforms to remove those images when they are found.

It is right to put the onus on the tech companies. All the way through the writing of this Bill, at all the consultation meetings we have been to, we have heard the tech companies say, “It’s too hard; it’s not possible because of privacy, data, security and cost.” I am sure that is what the mine owners said in the 19th century when they were told by the Government to stop sending children down the mines. It is not good enough. These are the richest, most powerful companies in the world. They are more powerful than an awful lot of countries, yet they have no democratic accountability. If they can employ real-time facial recognition at airports, they can find a way to remove child abuse images from the internet.

This leads me on to new clause 17, tabled by the right hon. Member for Barking (Dame Margaret Hodge), which would introduce individual director liability for non-compliance. I completely support that sentiment and I agree that this is likely to be the only way we will inject some urgency into the process of compliance. Why should directors who are profiting from the platforms not be responsible if children suffer harm as a result of using their products? That is certainly the case in many other industries. The right hon. Lady used the example of the building trade. Of course there will always be accidents, but if individual directors face the prospect of personal liability, they will act to address the systemic issues, the problems with the processes and the malevolent algorithms that deliberately draw users towards harm.

William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

My hon. Friend knows that I too take a great interest in this, and I am glad that the Government have agreed to continue discussions on this question. Is she aware that the personal criminal liability for directors flows from the corporate criminal liability in the company of which they are a director, and that their link to the criminal act itself, even if the company has not been or is not being prosecuted, means that the matter has to be made clear in the legislation, so that we do not have any uncertainty about the relationship of the company director and the company of which he is a director?

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need a named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.

I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their lordships to look at tightening and accelerating the age verification and giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government’s name, but it needs to go further when it goes to the other place.

Paul Scully Portrait Paul Scully
- View Speech - Hansard - - - Excerpts

I thank Members for their contributions during today’s debate and for their ongoing engagement with such a crucial piece of legislation. I will try to respond to as many of the issues raised as possible.

My right hon. Friend the Member for Haltemprice and Howden (Mr Davis), who is not in his place, proposed making the promotion of self-harm a criminal offence. The Government are sympathetic to the intention behind that proposal; indeed, we asked the Law Commission to consider how the criminal law might address that, and have agreed in principle to create a new offence of encouraging or assisting serious self-harm. The form of the offence recommended by the Law Commission is based on the broadly comparable offence of encouraging or assisting suicide. Like that offence, it covers the encouragement of, or assisting in, self-harm by means of communication and in other ways. When a similar amendment was tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman) in Committee, limiting the offence to encouragement or assistance by means of sending a message, the then Minister, my right hon. Friend the Member for Croydon South, said it would give only partial effect to the Law Commission’s recommendation. It remains the Government’s intention to give full effect to the Law Commission’s recommendations in due course.

20:43
I recognise the strong cross-party support for the amendment and the terrible damage done by online communications that encourage self-harm. The Molly Russell case has been mentioned by many Members today, and I send my condolences to Mr Russell, who was here earlier—I welcomed him to the Gallery—to listen to the early parts of this debate, along with other people who have suffered in a similar fashion. That case illustrates all too clearly that we have to do much more to protect young people like Molly from such harmful content. As we signalled in a written ministerial statement on 29 November, the Government intend to introduce in the Lords a new communications offence of encouraging self-harm.
My right hon. Friend the Member for Chelmsford (Vicky Ford) spoke so powerfully and movingly. She bared her soul in her personal testimony, having kept this deep inside herself for four decades. For her to come in front of us, in public, and give her testimony, all I can say is thank you. I commit to working with her to see what more we can do to ensure that eating disorders are captured in legislation as best we can. This will clearly be for children, but we want to see what more we can do for everyone and to protect the most vulnerable.
New clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden, would add a communications offence of encouraging or assisting self-harm to the Suicide Act 1961. I recognise the link between self-harm and suicide, but the two are distinct. The 1961 Act is about encouraging or assisting suicide, not self-harm, so any offence covering the latter should be separate from that Act. I would like to have a chat with him about the drafting of proposed new section 3A(8)(c), as I do not follow its logic and would like to test it a little more. For those reasons, I hope he will agree not to press his amendment and to allow the Government to move an amendment in the Lords.
New clause 17 would enable Ofcom to use enforcement sanctions directly against senior managers if their actions, directions, negligence or consent cause a service’s failure to comply with any of the enforceable requirements. It is vital that senior executives take their new responsibilities seriously. Under the Bill, Ofcom will be able to hold senior tech executives criminally liable if they fail to ensure that their company provides Ofcom with the information it needs to regulate effectively.
The existing provisions have been carefully designed to ensure that tech executives take personal responsibility for ensuring compliance with the framework, while ensuring sufficient legal clarity on what amounts to an offence and who can be prosecuted. The senior management liability is targeted specifically at the obligations to ensure that Ofcom is provided with the information it needs to regulate, as this is essential to the effective functioning of the regime. This approach is similar to the regulation of a number of other sectors, such as telecommunications.
New clause 17 would make senior managers personally liable, far beyond the current proposals, for the actions of the entities for which they work. The framework establishes a range of enforcement requirements, and a regulated service is the proper legal entity to be liable for failures to comply with those requirements. It would not be appropriate to extend that liability to any director or manager of a regulated service.
The Government do not believe it would be proportionate or effective to expand the scope of individual liability under this Bill, for a number of reasons. There is a real risk of damaging the UK’s attractiveness as a place to start and grow a digital business. It might also lead to unintended consequences, such as tech executives driving an over-zealous approach to content take-down, for fear of going to prison for a regulatory failing.
William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I was going to come to my hon. Friend in two seconds.

In the absence of clearly defined offences, the changes we are making to the Bill mean that it is likely to be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools necessary to drive the necessary culture change in the sector, from the boardroom down.

This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.

Margaret Hodge Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.

On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.

Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.

Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.

As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.

As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to and there is an abundance of experience in committed representative groups that are already engaged and will be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding an additional statutory body would duplicate existing provision, creating a confusing landscape, and that would not be in the best interests of children.

Sarah Champion Portrait Sarah Champion
- Hansard - - - Excerpts

I hear what the Minister is saying about creating a statutory body, but will he assure this House that there is a specific vehicle for children’s voices to be heard in this? I ask because most of us here are not facing, as our children are, the daily traumas and the constant re-creation of apps and new social media ways of reaching out to children. Unless we have their voice heard, this Bill is not going to be robust enough.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.

We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.

My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need to have joined-up government, to make sure that lots of bits of legislation and all Departments are working in this space. This is a really good example of joined-up government, where we have to join together.

Natalie Elphicke Portrait Mrs Elphicke
- Hansard - - - Excerpts

Will the Minister confirm that, in line with the discussions that have been had, the Government will look to bring back amendments, should they be needed, in line with new clause 55 and perhaps schedule 7, as the Bill goes to the Lords or returns for further consideration in this House?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.

We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deep fake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.

We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities for failing to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want to allow the victim support service to provide consistency for victims requiring support.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I understand. We are ahead of the Lords on publication, so yes is the answer.

I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach for category 1 designation—not just the smaller companies that he was talking about, but companies that are pushing that barrier to get to category 1. I very much get his view that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences that he describes. I look forward to working with him to get the changes to the Bill to work exactly as he describes.

Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.

John McDonnell Portrait John McDonnell
- Hansard - - - Excerpts

Just briefly, because I know that the Minister is about to finish, can he respond on amendment 204 with regard to the protection of journalists?

Paul Scully Portrait Paul Scully
- Hansard - - - Excerpts

I am happy to continue talking to the right hon. Gentleman, but I believe that we have enough protections in the Bill, with the human touch that we have added after the automatic flagging up of inquiries. The NCA will also have to have due regard to protecting sources. I will continue to work with him on that.

I have not covered everybody’s points, but this has been a very productive debate. I thank everyone for their contributions. We are really keen to get the Bill on the books and to act quickly to ensure that we can make children as safe as possible online.

Question put and agreed to.

New clause 11 accordingly read a Second time, and added to the Bill.

New Clause 12

Warning notices

‘(1) OFCOM may give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) to a provider relating to a service or part of a service only after giving a warning notice to the provider that they intend to give such a notice relating to that service or that part of it.

(2) A warning notice under subsection (1) relating to the use of accredited technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a) and (3)(a)) must—

(a) contain details of the technology that OFCOM are considering requiring the provider to use,

(b) specify whether the technology is to be required in relation to terrorism content or CSEA content (or both),

(c) specify any other requirements that OFCOM are considering imposing (see section 106(2) to (4)),

(d) specify the period for which OFCOM are considering imposing the requirements (see section 106(6)),

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(3) A warning notice under subsection (1) relating to the development or sourcing of technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) and (3)(b)) must—

(a) describe the proposed purpose for which the technology must be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),

(b) specify steps that OFCOM consider the provider needs to take in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),

(c) specify the proposed period within which the provider must take each of those steps,

(d) specify any other requirements that OFCOM are considering imposing,

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(4) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) that relates to both the user-to-user part of a combined service and the search engine of the service (as described in section (Notices to deal with terrorism content or CSEA content (or both))(4)(c) or (d)) may be given to the provider of the service only if—

(a) two separate warning notices have been given to the provider (one relating to the user-to-user part of the service and the other relating to the search engine), or

(b) a single warning notice relating to both the user-to-user part of the service and the search engine has been given to the provider.

(5) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) may not be given to a provider until the period allowed by the warning notice for the provider to make representations has expired.’—(Paul Scully.)

This clause, which would follow NC11, also replaces part of existing clause 104. There are additions to the warning notice procedure to take account of the new options for notices under NC11.

Brought up, read the First and Second time, and added to the Bill.

New Clause 20

OFCOM’s reports about news publisher content and journalistic content

‘(1) OFCOM must produce and publish a report assessing the impact of the regulatory framework provided for in this Act on the availability and treatment of news publisher content and journalistic content on Category 1 services (and in this section, references to a report are to a report described in this subsection).

(2) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of two years beginning with the day on which sections (Duties to protect news publisher content) and 16 come into force (or if those sections come into force on different days, the period of two years beginning with the later of those days).

(3) A report must, in particular, consider how effective the duties to protect such content set out in sections (Duties to protect news publisher content) and 16 are at protecting it.

(4) In preparing a report, OFCOM must consult—

(a) persons who represent recognised news publishers,

(b) persons who appear to OFCOM to represent creators of journalistic content,

(c) persons who appear to OFCOM to represent providers of Category 1 services, and

(d) such other persons as OFCOM consider appropriate.

(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.

(6) The Secretary of State may require OFCOM to produce and publish a further report if the Secretary of State considers that the regulatory framework provided for in this Act is, or may be, having a detrimental effect on the availability and treatment of news publisher content or journalistic content on Category 1 services.

(7) But such a requirement may not be imposed—

(a) within the period of three years beginning with the date on which the first report is published, or

(b) more frequently than once every three years.

(8) For further provision about reports under this section, see section 138.

(9) In this section—

“journalistic content” has the meaning given by section 16;

“news publisher content” has the meaning given by section 49;

“recognised news publisher” has the meaning given by section 50.

(10) For the meaning of “Category 1 service”, see section 82 (register of categories of services).’—(Paul Scully.)

This inserts a new clause (after clause 135) which requires Ofcom to publish a report on the impact of the regulatory framework provided for in the Bill within two years of the relevant provisions coming into force. It also allows the Secretary of State to require Ofcom to produce further reports.

Brought up, read the First and Second time, and added to the Bill.

New Clause 40

Amendment of Enterprise Act 2002

‘In Schedule 15 to the Enterprise Act 2002 (enactments relevant to provisions about disclosure of information), at the appropriate place insert—

“Online Safety Act 2022.”’—(Paul Scully.)



This amendment has the effect that the information gateway in section 241 of the Enterprise Act 2002 allows disclosure of certain kinds of information by a public authority (such as the Competition and Markets Authority) to OFCOM for the purposes of OFCOM’s functions under this Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Former providers of regulated services

‘(1) A power conferred by Chapter 6 of Part 7 (enforcement powers) to give a notice to a provider of a regulated service is to be read as including power to give a notice to a person who was, at the relevant time, a provider of such a service but who has ceased to be a provider of such a service (and that Chapter and Schedules 13 and 15 are to be read accordingly).

(2) “The relevant time” means—

(a) the time of the failure to which the notice relates, or

(b) in the case of a notice which relates to the requirement in section 90(1) to co-operate with an investigation, the time of the failure or possible failure to which the investigation relates.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 162, provides that a notice that may be given under Chapter 6 of Part 7 to a provider of a regulated service may also be given to a former provider of a regulated service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 43

Amendments of Part 4B of the Communications Act

‘Schedule (Amendments of Part 4B of the Communications Act) contains amendments of Part 4B of the Communications Act.’—(Paul Scully.)

This new clause introduces a new Schedule amending Part 4B of the Communications Act 2003 (see NS2).

Brought up, read the First and Second time, and added to the Bill.

New Clause 44

Repeal of Part 4B of the Communications Act: transitional provision etc

‘(1) Schedule (Video-sharing platform services: transitional provision etc) contains transitional, transitory and saving provision—

(a) about the application of this Act and Part 4B of the Communications Act during a period before the repeal of Part 4B of the Communications Act (or, in the case of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), in respect of charging years as mentioned in that Part);

(b) in connection with the repeal of Part 4B of the Communications Act.

(2) The Secretary of State may by regulations make transitional, transitory or saving provision of the kind mentioned in subsection (1)(a) and (b).

(3) Regulations under subsection (2) may amend or repeal—

(a) Part 2A of Schedule 3;

(b) Schedule (Video-sharing platform services: transitional provision etc).

(4) Regulations under subsection (2) may, in particular, make provision about—

(a) the application of Schedule (Video-sharing platform services: transitional provision etc) in relation to a service if the transitional period in relation to that service ends on a date before the date when section 172 comes into force;

(b) the application of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), including further provision about the calculation of a provider’s non-Part 4B qualifying worldwide revenue for the purposes of paragraph 19 of that Schedule;

(c) the application of Schedule 10 (recovery of OFCOM’s initial costs), and in particular how fees chargeable under that Schedule may be calculated, in respect of charging years to which Part 3 of Schedule (Video-sharing platform services: transitional provision etc) relates.’—(Paul Scully.)

This new clause introduces a new Schedule containing transitional provisions (see NS3), and provides a power for the Secretary of State to make regulations containing further transitional provisions etc.

Brought up, read the First and Second time, and added to the Bill.

New Clause 51

Publication by providers of details of enforcement action

‘(1) This section applies where—

(a) OFCOM have given a person (and not withdrawn) any of the following—

(i) a confirmation decision;

(ii) a penalty notice under section 119;

(iii) a penalty notice under section 120(5);

(iv) a penalty notice under section 121(6), and

(b) the appeal period in relation to the decision or notice has ended.

(2) OFCOM may give to the person a notice (a “publication notice”) requiring the person to—

(a) publish details describing—

(i) the failure (or failures) to which the decision or notice mentioned in subsection (1)(a) relates, and

(ii) OFCOM’s response, or

(b) otherwise notify users of the service to which the decision or notice mentioned in subsection (1)(a) relates of those details.

(3) A publication notice may require a person to publish details under subsection (2)(a) or give notification of details under subsection (2)(b) or both.

(4) A publication notice must—

(a) specify the decision or notice mentioned in subsection (1)(a) to which it relates,

(b) specify or describe the details that must be published or notified,

(c) specify the form and manner in which the details must be published or notified,

(d) specify a date by which the details must be published or notified, and

(e) contain information about the consequences of not complying with the notice.

(5) Where a publication notice requires a person to publish details under subsection (2)(a) the notice may also specify a period during which publication in the specified form and manner must continue.

(6) Where a publication notice requires a person to give notification of details under subsection (2)(b) the notice may only require that notification to be given to United Kingdom users of the service (see section 184).

(7) A publication notice may not require a person to publish or give notification of anything that, in OFCOM’s opinion—

(a) is confidential in accordance with subsections (8) and (9), or

(b) is otherwise not appropriate for publication or notification.

(8) A matter is confidential under this subsection if—

(a) it relates specifically to the affairs of a particular body, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that body.

(9) A matter is confidential under this subsection if—

(a) it relates to the private affairs of an individual, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that individual.

(10) A person to whom a publication notice is given has a duty to comply with it.

(11) The duty under subsection (10) is enforceable in civil proceedings by OFCOM—

(a) for an injunction,

(b) for specific performance of a statutory duty under section 45 of the Court of Session Act 1988, or

(c) for any other appropriate remedy or relief.

(12) For the purposes of subsection (1)(b) “the appeal period”, in relation to a decision or notice mentioned in subsection (1)(a), means—

(a) the period during which any appeal relating to the decision or notice may be made, or

(b) where such an appeal has been made, the period ending with the determination or withdrawal of that appeal.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 129, gives OFCOM the power to require a person to whom a confirmation decision or penalty notice has been given to publish details relating to the decision or notice or to otherwise notify service users of those details.

Brought up, read the First and Second time, and added to the Bill.

New Clause 52

Exemptions from offence under section 152

‘(1) A recognised news publisher cannot commit an offence under section 152.

(2) An offence under section 152 cannot be committed by the holder of a licence under the Broadcasting Act 1990 or 1996 in connection with anything done under the authority of the licence.

(3) An offence under section 152 cannot be committed by the holder of a multiplex licence in connection with anything done under the authority of the licence.

(4) An offence under section 152 cannot be committed by the provider of an on-demand programme service in connection with anything done in the course of providing such a service.

(5) An offence under section 152 cannot be committed in connection with the showing of a film made for cinema to members of the public.’—(Paul Scully.)

This new clause contains exemptions from the offence in clause 152 (false communications). The clause ensures that holders of certain licences are only exempt if they are acting as authorised by the licence and, in the case of Wireless Telegraphy Act licences, if they are providing a multiplex service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 53

Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2)

‘(1) A person (A) commits an offence if—

(a) A sends a communication by electronic means which consists of or includes flashing images (see subsection (13)),

(b) either condition 1 or condition 2 is met, and

(c) A has no reasonable excuse for sending the communication.

(2) Condition 1 is that—

(a) at the time the communication is sent, it is reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it, and

(b) A sends the communication with the intention that such an individual will suffer harm as a result of viewing the flashing images.

(3) Condition 2 is that, when sending the communication—

(a) A believes that an individual (B)—

(i) whom A knows to be an individual with epilepsy, or

(ii) whom A suspects to be an individual with epilepsy,

will, or might, view it, and

(b) A intends that B will suffer harm as a result of viewing the flashing images.

(4) In subsections (2)(a) and (3)(a), references to viewing the communication are to be read as including references to viewing a subsequent communication forwarding or sharing the content of the communication.

(5) The exemptions contained in section (Exemptions from offence under section 152) apply to an offence under subsection (1) as they apply to an offence under section 152.

(6) For the purposes of subsection (1), a provider of an internet service by means of which a communication is sent is not to be regarded as a person who sends a communication.

(7) In the application of subsection (1) to a communication consisting of or including a hyperlink to other content, references to the communication are to be read as including references to content accessed directly via the hyperlink.

(8) A person (A) commits an offence if—

(a) A shows an individual (B) flashing images by means of an electronic communications device,

(b) when showing the images—

(i) A knows that B is an individual with epilepsy, or

(ii) A suspects that B is an individual with epilepsy,

(c) when showing the images, A intends that B will suffer harm as a result of viewing them, and

(d) A has no reasonable excuse for showing the images.

(9) An offence under subsection (1) or (8) cannot be committed by a healthcare professional acting in that capacity.

(10) A person who commits an offence under subsection (1) or (8) is liable—

(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);

(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding six months or a fine not exceeding the statutory maximum (or both);

(c) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).

(11) It does not matter for the purposes of this section whether flashing images may be viewed at once (for example, a GIF that plays automatically) or only after some action is performed (for example, pressing play).

(12) In this section—

(a) references to sending a communication include references to causing a communication to be sent;

(b) references to showing flashing images include references to causing flashing images to be shown.

(13) In this section—

“electronic communications device” means equipment or a device that is capable of transmitting images by electronic means;

“flashing images” means images which carry a risk that an individual with photosensitive epilepsy who viewed them would suffer a seizure as a result;

“harm” means—

(a) a seizure, or

(b) alarm or distress;

“individual with epilepsy” includes, but is not limited to, an individual with photosensitive epilepsy;

“send” includes transmit and publish (and related expressions are to be read accordingly).

(14) This section extends to England and Wales and Northern Ireland.’—(Paul Scully.)

This new clause creates (for England and Wales and Northern Ireland) a new offence of what is sometimes known as “epilepsy trolling” - sending or showing flashing images electronically to people with epilepsy intending to cause them harm.

Brought up, read the First and Second time, and added to the Bill.

New Clause 16

Communication offence for encouraging or assisting self-harm

‘(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.”’—(Mr Davis.)

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

Brought up, and read the First time.

Question put, That the clause be read a Second time.

20:49

Division 107

Ayes: 242

Noes: 308

21:03
Proceedings interrupted (Programme Order, 20 March).
The Deputy Speaker put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 17
Liability of directors for compliance failure
‘(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.
(2) If OFCOM considers that the failure results from any—
(a) action,
(b) direction,
(c) neglect, or
(d) with the consent’—(Dame Margaret Hodge.)
This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.
Brought up.
Question put, That the clause be added to the Bill.
21:03

Division 108

Ayes: 238

Noes: 311

New Clause 28
Establishment of Advocacy Body
‘(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) “enforceable requirements” relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.
(8) The Advocacy Body may undertake research on their own account.
(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.
(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).’—(John Nicolson.)
Brought up.
Question put, That the clause be added to the Bill.
21:16

Division 109

Ayes: 240

Noes: 312

Clause 47
Duties and the first codes of practice
Amendment made: 234, page 45, line 2, at end insert—
“(9) This section is subject to Part 2 of Schedule (Video-sharing platform services: transitional provision etc) (video-sharing platform services: transitional provision etc).” —(Paul Scully.)
This amendment ensures that clause 47 is subject to Part 2 of the new transitional provisions Schedule (see NS3) - otherwise clause 47 might have the effect that a provider of a service currently regulated by Part 4B of the Communications Act 2003 must comply with a safety duty during the transitional period.
Clause 84
OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments made: 102, page 72, line 28, leave out paragraph (a) and insert—
“(a) the risks of harm to individuals in the United Kingdom presented by illegal content present on regulated user-to-user services and by the use of such services for the commission or facilitation of priority offences;
(aa) the risk of harm to individuals in the United Kingdom presented by search content of regulated search services that is illegal content;”
This amendment ensures that OFCOM must prepare risk profiles relating to the use of user-to-user services for the commission or facilitation of priority offences.
Amendment 103, page 72, line 40, leave out from the second “the” to end of line and insert
“risk of harm mentioned in subsection (1)(b)”.
This technical amendment is consequential on Amendment 102.
Amendment 104, page 73, line 23, leave out “(1)(c)” and insert “(1)(a) or (c)”.
This technical amendment is consequential on Amendment 102.
Amendment 105, page 73, line 24, at end insert—
“(c) in the case of a risk assessment or risk profiles which relate only to the risk of harm mentioned in subsection (1)(aa), are to be read as references to regulated search services.”
This technical amendment is consequential on Amendment 102.
Amendment 106, page 73, line 36, at end insert—
““priority offence” has the same meaning as in Part 3 (see section 52).”—(Paul Scully.)
This amendment inserts a definition of “priority offence” into clause 84.
Clause 85
OFCOM’s guidance about risk assessments
Amendments made: 107, page 73, line 38, leave out subsection (1) and insert—
“(1) As soon as reasonably practicable after OFCOM have published the first risk profiles relating to the illegality risks, OFCOM must produce guidance to assist providers of regulated user-to-user services in complying with their duties to carry out illegal content risk assessments under section 8.
(1A) As soon as reasonably practicable after OFCOM have published the first risk profiles relating to the risk of harm from illegal content, OFCOM must produce guidance to assist providers of regulated search services in complying with their duties to carry out illegal content risk assessments under section 23.”
This amendment splits up OFCOM’s duty to produce guidance for providers about illegal content risk assessments, since, for user-to-user services, the effect of Amendment 102 is that such a risk assessment must also consider risks around the use of such services for the commission or facilitation of priority offences.
Amendment 108, page 74, line 11, leave out “(1) or”.
This technical amendment is consequential on Amendment 107.
Amendment 109, page 74, line 12, leave out “those subsections are” and insert “that subsection is”.
This technical amendment is consequential on Amendment 107.
Amendment 110, page 74, line 15, leave out “subsection (7)” and insert “this section”.
This technical amendment is consequential on Amendment 107.
Amendment 111, page 74, line 17, at end insert—
““illegality risks” means the risks mentioned in section 84(1)(a);”.
This amendment inserts a definition of “illegality risks” which is now used in clause 85.
Amendment 112, page 74, line 19, leave out “84(1)(a)” and insert “84(1)(aa)”.—(Paul Scully.)
This technical amendment is consequential on Amendment 102.
Clause 86
Power to require information
Amendment made: 113, page 75, line 38, at end insert—
“(fa) the purpose of assessing whether to give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) relating to the development or sourcing of technology (see subsections (2)(b) and (3)(b) of that section);”.—(Paul Scully.)
This amendment makes it clear that OFCOM have the power to require information to decide whether to give a notice under the clause inserted by NC11 which requires a provider to develop or source technology to deal with CSEA content.
Clause 89
Report by skilled persons
Amendments made: 114, page 77, line 36, leave out “either or both” and insert “any”.
This amendment is consequential on Amendment 116.
Amendment 115, page 77, line 39, leave out “or”.
This amendment is consequential on Amendment 116.
Amendment 116, page 77, line 43, at end insert—
“(c) assisting OFCOM in deciding whether to give a provider of a Part 3 service a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) requiring the provider to use their best endeavours to develop or source technology dealing with CSEA content (see subsections (2)(b) and (3)(b) of that section), or assisting OFCOM in deciding the requirements to be imposed by such a notice.”—(Paul Scully.)
This amendment extends OFCOM’s power to require a skilled person’s report to cover assistance in relation to a notice under NC11 which requires a provider to develop or source technology to deal with CSEA content.
Clause 104
Amendment made: 117, page 87, line 9, leave out clause 104.—(Paul Scully.)
This amendment leaves out existing clause 104, which is replaced by NC11 and NC12.
Clause 105
Matters relevant to a decision to give a notice under section 104(1)
Amendments made: 118, page 88, line 40, at beginning insert
“In the case of a notice requiring the use of accredited technology,”.
This amendment ensures that the matters listed in clause 105(2) which OFCOM have to take account of in deciding whether to give a notice under NC11 apply just to such notices which require the use of accredited technology.
Amendment 119, page 89, line 25, at end insert—
“(3A) In the case of a notice relating to the development or sourcing of technology, subsection (2) applies—
(a) as if references to relevant content were to CSEA content, and
(b) with the omission of paragraphs (h), (i) and (j).” —(Paul Scully.)
This amendment sets out how the matters listed in clause 105(2) which OFCOM have to take account of in deciding whether to give a notice under NC11 apply to such notices which require the development or sourcing of technology to deal with CSEA content.
Clause 106
Notices under section 104(1): supplementary
Amendments made: 120, page 89, line 47, at end insert—
“(4A) A notice given to a provider of a Part 3 service requiring the use of accredited technology is to be taken to require the provider to make such changes to the design or operation of the service as are necessary for the technology to be used effectively.”
This amendment makes it clear that if OFCOM give a notice under NC11 requiring a provider to use accredited technology, that encompasses necessary design changes to a service.
Amendment 121, page 90, line 1, after “notice” insert
“requiring the use of accredited technology”.
This amendment ensures that requirements listed in clause 106(5) about the contents of a notice given under NC11 apply just to such notices which require the use of accredited technology.
Amendment 122, page 90, line 15, after “notice” insert
“requiring the use of accredited technology”.
This amendment is consequential on Amendment 121.
Amendment 123, page 90, line 17, at end insert—
“(6A) A notice relating to the development or sourcing of technology must—
(a) give OFCOM’s reasons for their decision to give the notice,
(b) describe the purpose for which technology is required to be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),
(c) specify steps that the provider is required to take (including steps relating to the use of a system or process) in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),
(d) specify a reasonable period within which each of the steps specified in the notice must be taken,
(e) contain details of any other requirements imposed by the notice,
(f) contain details of the rights of appeal under section 140,
(g) contain information about when OFCOM intend to review the notice (see section 107), and
(h) contain information about the consequences of not complying with the notice (including information about the further kinds of enforcement action that it would be open to OFCOM to take).
(6B) In deciding what period or periods to specify for steps to be taken in accordance with subsection (6A)(d), OFCOM must, in particular, consider—
(a) the size and capacity of the provider, and
(b) the state of development of technology capable of achieving the purpose described in the notice in accordance with subsection (6A)(b).”
This amendment sets out the requirements which apply regarding the contents of a notice given under the NC11 requiring the development or sourcing of technology to deal with CSEA content.
Amendment 124, page 90, line 18, after “the” insert “design and”.
This amendment makes it clear that a notice given under NC11 may impose requirements about design of a service.
Amendment 125, page 90, line 24, leave out
“section 104 and this section”
and insert “this Chapter”.—(Paul Scully.)
This amendment is consequential on NC12.
Clause 107
Review and further notice under section 104(1)
Amendments made: 126, page 90, line 42, leave out from “must” to end of line 44 and insert
“carry out a review of the provider’s compliance with the notice—
(a) in the case of a notice requiring the use of accredited technology, before the end of the period for which the notice has effect;
(b) in the case of a notice relating to the development or sourcing of technology, before the last date by which any step specified in the notice is required to be taken.”
This amendment is consequential on NC11.
Amendment 127, page 90, line 45, leave out “The” and insert
“In the case of a notice requiring the use of accredited technology, the”.
This amendment is needed because the matters listed in the provision which is amended can only relate to a notice given under NC11 which requires the use of accredited technology.
Amendment 128, page 91, line 10, leave out
“require the use of different accredited technology from”
and insert “impose different requirements from”.
This amendment is needed because the provision which is amended is relevant to all notices given under NC11 (not just those which require the use of accredited technology).
Amendment 129, page 91, line 12, leave out
“Section 104(7) to (10) (warning notice) do”
and insert
“Section (Warning notices) (warning notices) does”.—(Paul Scully.)
This amendment is consequential on the warning notice procedure now being contained in NC12.
Clause 112
Requirements enforceable by OFCOM against providers of regulated services
Amendment made: 174, page 93, line 38, at end insert—
“Section (Duties to protect news publisher content)
News publisher content”
This amendment ensures that Ofcom are able to use their enforcement powers in Chapter 6 of Part 7 in relation to a breach of any of the duties set out in NC19.
Clause 115
Confirmation decisions: risk assessments
Amendments made: 130, page 96, line 40, leave out “illegal content” and insert
“matters required to be covered by an illegal content risk assessment”.
This amendment ensures that clause 115, which relates to a confirmation decision that may be given where a risk assessment is defective, covers matters in a risk assessment relating to the use of a service for commission or facilitation of priority offences, not just illegal content.
Amendment 131, page 96, line 41, after “9(2)” insert “(b) or (c)”.
This technical amendment is consequential on Amendment 61.
Amendment 132, page 96, line 44, leave out
“content that is harmful to children”
and insert
“matters required to be covered by a children’s risk assessment”.
This amendment brings clause 115(2)(b) (children’s risk assessments) into line with clause 115(2)(a) (illegal content risk assessments).
Amendment 133, page 97, line 15, leave out the definition of
“content that is harmful to children”.
This technical amendment is consequential on Amendment 132.
Amendment 134, page 97, line 17, leave out the definition of “illegal content”.—(Paul Scully.)
This technical amendment is consequential on Amendment 130.
Clause 119
Penalty for failure to comply with confirmation decision
Amendments made: 212, page 101, line 16, leave out “intend” and insert “propose”.
This amendment is a technical amendment and ensures that clause 119 uses the same terminology as used in other clauses in Chapter 6 of Part 7.
Amendment 213, page 101, line 19, at end insert “(with any supporting evidence)”.—(Paul Scully.)
This amendment provides that where OFCOM propose to give a penalty notice to a person in connection with a failure to comply with a confirmation decision, the representations that may be made to OFCOM before that notice is given may include supporting evidence.
Clause 120
Penalty for failure to comply with notice under section 104(1)
Amendment made: 135, page 101, line 37, leave out from beginning to “OFCOM”.—(Paul Scully.)
This is about a penalty notice which OFCOM may give for failure to comply with a notice given under NC11. The amendment omits words which are not apt to cover such a notice which relates to the development or sourcing of technology to deal with CSEA content.
Clause 129
Publication of details of enforcement action
Amendment made: 214, page 113, line 3, after “person” insert “(and not withdrawn)”.—(Paul Scully.)
This amendment provides that OFCOM’s duty to publish information following the giving of a confirmation decision or penalty notice to a person does not apply where the decision or notice has been withdrawn.
Clause 138
OFCOM’s reports
Amendment made: 175, page 118, line 29, at end insert—
“(aa) a report under section (OFCOM’s reports about news publisher content and journalistic content) (report about news publisher content and journalistic content),”.—(Paul Scully.)
This amendment ensures that the provisions about excluding confidential information from a report before publication apply to the duty to publish the report produced under NC20.
Clause 150
Review
Amendment made: 176, page 126, line 36, at end insert—
“(5A) In carrying out the review, the Secretary of State must take into account any report published by OFCOM under section (OFCOM’s reports about news publisher content and journalistic content) (reports about news publisher content and journalistic content).”—(Paul Scully.)
This amendment ensures that the Secretary of State is required to take into account Ofcom’s reports published under NC20 when carrying out the review under clause 150.
Page 127
Amendment made: 239, page 127, line 11, leave out clause 151.—(Paul Scully.)
This amendment omits clause 151, which had introduced a new offence relating to harmful communications.
Clause 152
False communications offence
Amendments made: 138, page 128, line 22, leave out subsections (4) and (5).
This amendment leaves out material which now appears, with changes, in NC52.
Amendment 240, page 128, line 29, at end insert—
“(5A) See section (Exemptions from offence under section 152) for exemptions from the offence under this section.”—(Paul Scully.)
This amendment adds a signpost to NC52.
Clause 153
Threatening communications offence
Amendment made: 215, page 129, line 29, leave out
“maximum summary term for either-way offences”
and insert
“general limit in a magistrates’ court”.—(Paul Scully.)
This amendment relates to the maximum term of imprisonment on summary conviction of an either-way offence in England and Wales. The amendment inserts a reference to the general limit in a magistrates’ court, meaning the time limit in section 224(1) of the Sentencing Code, which, currently, is 12 months.
Clause 154
Interpretation of sections 151 to 153
Amendments made: 241, page 129, line 33, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 242, page 129, line 34, leave out “any of those sections” and insert “section 152 or 153”.
This is a technical amendment to correct a reference, taking into account NC52.
Amendment 217, page 129, line 38, after “sends” insert
“, or gives to an individual,”.
This amendment clarifies that the new communications offences cover cases of giving (a letter etc) to an individual.
Amendment 218, page 129, line 43, at end insert
“, or
(ii) given to an individual.”
This amendment clarifies that the new communications offences cover cases of causing a letter etc to be given to an individual.
Amendment 243, page 130, line 10, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 219, page 130, line 10, leave out “, transmission or publication”.
This is a technical drafting change reflecting the fact that the reference in this provision to sending a message already covers cases of transmission or publication.
Amendment 244, page 130, line 16, leave out “151 or”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 245, page 130, line 18, leave out “151” and insert “152”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 220, page 130, line 21, at end insert
“(and in this subsection “sending” includes “giving”, and “sender” is to be read accordingly)”.
This amendment ensures that references to sending in a technical provision relating to the new communications offences include giving.
Amendment 221, page 130, line 23, leave out “, transmitted or published”.
This is a technical drafting change reflecting the fact that the reference in this provision to sending a message already covers cases of transmission or publication.
Amendment 140, page 130, line 24, at end insert—
“(9A) “Recognised news publisher” has the meaning given by section 50.
(9B) “Multiplex licence” means a licence under section 8 of the Wireless Telegraphy Act 2006 which authorises the provision of a multiplex service within the meaning of section 42(6) of that Act.”—(Paul Scully.)
This amendment adds definitions of terms used in NC52.
Clause 155
Extra-territorial application and jurisdiction
Amendments made: 246, page 130, line 31, leave out “151(1),”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 222, page 130, line 32, leave out “United Kingdom person” and insert “person within subsection (2)”.
This is a technical drafting improvement resulting from the introduction of the new epilepsy trolling offence which extends to Northern Ireland as well as England and Wales (see NC53).
Amendment 223, page 130, leave out line 33 and insert
“A person is within this subsection if the person is—”.
This is a technical drafting improvement resulting from the introduction of the new epilepsy trolling offence which extends to Northern Ireland as well as England and Wales (see NC53).
Amendment 224, page 130, line 36, at end insert—
“(2A) Section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)(1) applies to an act done outside the United Kingdom, but only if the act is done by a person within subsection (2B).
(2B) A person is within this subsection if the person is—
(a) an individual who is habitually resident in England and Wales or Northern Ireland, or
(b) a body incorporated or constituted under the law of England and Wales or Northern Ireland.”
This amendment provides for extra-territorial application of the offence of sending flashing images electronically under the new clause inserted by NC53.
Amendment 247, page 130, line 37, leave out “151,”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 225, page 130, line 39, at end insert—
“(4) Proceedings for an offence committed under section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)(1) outside the United Kingdom may be taken, and the offence may for incidental purposes be treated as having been committed, at any place in England and Wales or Northern Ireland.
(5) This section extends to England and Wales and Northern Ireland.”—(Paul Scully.)
This amendment provides for courts in England and Wales or Northern Ireland to have jurisdiction over an offence of sending flashing images electronically (see NC53) that is committed outside the United Kingdom.
Clause 156
Liability of corporate officers
Amendments made: 248, page 130, line 41, leave out “151,”.
Clause 156 is about the liability of corporate officers etc for offences. This amendment removes a reference to clause 151 (the harmful communications offence omitted by Amendment 239).
Amendment 226, page 130, line 41, leave out “or 153” and insert
“, 153 or (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)”.
Clause 156 is about the liability of corporate officers etc for offences. This amendment ensures that the provision applies to the epilepsy trolling offence inserted by NC53.
Amendment 227, page 131, line 9, at end insert—
“(3) This section extends to England and Wales and Northern Ireland.”—(Paul Scully.)
This amendment states the extent of clause 156.
Clause 158
Repeals in connection with offences under sections 151, 152 and 153
Amendments made: 249, page 132, line 3, leave out from beginning to end of line 4 and insert
“Section 127(2)(a) and (b) of the Communications Act (false messages) is repealed so far as it extends”.
This amendment, together with Amendment 250, provides for the repeal of section 127(2)(a) and (b) of the Communications Act 2003 for England and Wales, but not (as previously) also the repeal of section 127(1) of that Act.
Amendment 250, page 132, line 6, leave out paragraphs (a) and (b).
This amendment, together with Amendment 249, provides for the repeal of section 127(2)(a) and (b) of the Communications Act 2003 for England and Wales, but not (as previously) also the repeal of section 127(1) of that Act.
Amendment 251, page 132, line 8, leave out subsection (2) and insert—
“(2) The following provisions of the Malicious Communications Act 1988 are repealed—
(a) section 1(1)(a)(ii),
(b) section 1(1)(a)(iii), and
(c) section 1(2).”—(Paul Scully.)
This amendment provides for the repeal of the specified provisions of the Malicious Communications Act 1988, but not (as previously) the whole of that Act.
Clause 159
Consequential amendments
Amendments made: 252, page 132, line 10, leave out “151,”.
Clause 159 introduces a Schedule of consequential amendments. This amendment omits the reference to clause 151 (consequential on the omission of clause 151 (see Amendment 239)).
Amendment 228, page 132, line 11, leave out “and 153” and insert
“, 153 and (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland)”.—(Paul Scully.)
Clause 159 introduces a Schedule of consequential amendments. This amendment adds a reference to the new epilepsy trolling offence (see NC53).
Clause 172
Repeal of Part 4B of the Communications Act
Amendment made: 229, page 139, line 8, at end insert—
“(3) In this Act, omit—
(a) section (Amendments of Part 4B of the Communications Act), and
(b) Schedule (Amendments of Part 4B of the Communications Act).
(4) In the Audiovisual Media Services (Amendment) (EU Exit) Regulations 2020 (S.I. 2020/1536), omit regulation 4.”—(Paul Scully.)
This amendment revokes enactments which amend Part 4B of the Communications Act 2003, which is repealed by clause 172.
Clause 182
Parliamentary procedure for regulations
Amendments made: 235, page 147, line 1, at end insert—
“(ca) regulations under section (Repeal of Part 4B of the Communications Act: transitional provision etc)(2),”.
This amendment provides for the affirmative procedure to apply to regulations under the new clause inserted by NC44.
Amendment 236, page 147, line 42, at end insert—
“(da) regulations under paragraph 6B(1) of Schedule 3, or”.—(Paul Scully.)
This amendment provides for the negative procedure to apply to regulations under paragraph 6B(1) of Schedule 3 (regulations setting a date when the requirements to carry out risk assessments etc begin for providers of services currently regulated by Part 4B of the Communications Act 2003).
Clause 196
Commencement and transitional provision
Amendment made: 237, page 161, line 39, at end insert—
“(3A) Regulations under subsection (2) may not bring section 172 into force before the end of the period of six months beginning with the date specified in regulations under paragraph 6B(1) of Schedule 3.”—(Paul Scully.)
Regulations under paragraph 6B(1) of Schedule 3 will set a date when the requirements to carry out risk assessments etc begin for providers of services currently regulated by Part 4B of the Communications Act 2003. This amendment ensures that Part 4B may not be repealed until at least 6 months after the chosen date (to give providers time to do their assessments before they become subject to the safety duties).
New Schedule 2
Amendments of Part 4B of the Communications Act
“1 Part 4B of the Communications Act (video-sharing platform services) is amended in accordance with this Schedule.
2 In section 368U (maintenance of list of providers)—
(a) omit subsection (2);
(b) for subsection (3) substitute—
‘(3) OFCOM must publish the up to date list on a publicly accessible part of their website.’
3 In section 368V(4) (meaning of ‘significant differences’), for the words from ‘the determination of jurisdiction’ to the end substitute ‘whether or not the person has the required connection with the United Kingdom under section 368S(2)(d)’.
4 In section 368Y(2)(d) (information to be provided by providers of video-sharing platform services), for the words from ‘under the jurisdiction’ to the end substitute ‘subject to regulation under this Part in respect of the video-sharing platform service that P provides’.
5 In section 368Z1(3) (duty to take appropriate measures), for the words from ‘of the description’ to the end substitute ‘to monitor the information which they transmit or store, or actively to seek to discover facts or circumstances indicating illegal activity’.
6 In section 368Z10(3)(a) (power to demand information), for the words from ‘falls under’ to the end substitute ‘has the required connection with the United Kingdom under section 368S(2)(d)’.
7 For section 368Z12 (co-operation with member States and the European Commission) substitute—
‘368Z12 Co-operation with EEA States
OFCOM may co-operate with EEA states which are subject to the Audiovisual Media Services Directive, and with the national regulatory authorities of such EEA states, for the following purposes—
(a) facilitating the carrying out by OFCOM of any of their functions under this Part; or
(b) facilitating the carrying out by the national regulatory authorities of the EEA states of any of their functions in relation to video-sharing platform services under that Directive as it has effect in EU law as amended from time to time.’”—(Paul Scully.)
This new Schedule amends Part 4B of the Communications Act 2003, which regulates video-sharing platform services. The amendments, which will apply during a transitional period prior to the repeal of Part 4B, are made in connection with the United Kingdom’s exit from the European Union.
Brought up, and added to the Bill.
New Schedule 3
Video-sharing platform services: transitional provision etc
“Part 1
Interpretation
1 (1) In this Schedule, “pre-existing Part 4B service” means—
(a) an internet service which—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1) and (2) of the Communications Act being met in relation to the service as a whole, and
(ii) was being provided immediately before this Schedule comes into force; or
(b) a dissociable section of an internet service, where that dissociable section—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1)(a) and (2) of the Communications Act being met in relation to that dissociable section, and
(ii) was being provided immediately before this Schedule comes into force.
(2) In sub-paragraph (1), any reference to a service provided before this Schedule comes into force includes a reference to a service provided in breach of the requirement in section 368V of the Communications Act.
2 In this Schedule—
“the relevant day”, in relation to a pre-existing Part 4B service or to a service which includes a pre-existing Part 4B service, means—
(a) the date when section 172 comes into force (repeal of Part 4B of the Communications Act), or
(b) if the pre-existing Part 4B service ceases to be a video-sharing platform service before the date mentioned in paragraph (a), the date when that service ceases to be a video-sharing platform service;
“safety duties” means the duties mentioned in section 6(2), (4) and (5), except the duties set out in—
(a) section 8 (illegal content risk assessments),
(b) section 10 (children’s risk assessments),
(c) section 12 (adults’ risk assessments), and
(d) section 20(2) (records of risk assessments);
“the transitional period”, in relation to a pre-existing Part 4B service or to a service which includes a pre-existing Part 4B service, means the period—
(a) beginning with the date when this Schedule comes into force, and
(b) ending with the relevant day;
“video-sharing platform service” has the same meaning as in Part 4B of the Communications Act (see section 368S of that Act).
Part 2
During the transitional period
Pre-existing Part 4B services which are regulated user-to-user services
3 (1) This paragraph applies in relation to a pre-existing Part 4B service which—
(a) is within the definition in paragraph (a) of paragraph 1(1), and
(b) is also a regulated user-to-user service.
(2) Both this Act and Part 4B of the Communications Act apply in relation to the pre-existing Part 4B service during the transitional period.
(3) But that is subject to—
(a) sub-paragraph (4),
(b) sub-paragraph (5), and
(c) paragraph 4.
(4) The following duties and requirements under this Act do not apply during the transitional period in relation to the pre-existing Part 4B service—
(a) the safety duties;
(b) the duties set out in section 34 (fraudulent advertising);
(c) the duties set out in section 57 (user identity verification);
(d) the requirements under section 59(1) and (2) (reporting CSEA content to the NCA);
(e) the duty on OFCOM to give a notice under section 64(1) requiring information in a transparency report;
(f) the requirements to produce transparency reports under section 64(3) and (4).
(5) OFCOM’s powers under Schedule 12 to this Act (powers of entry, inspection and audit) do not apply during the transitional period in relation to the pre-existing Part 4B service.
(6) In sub-paragraph (2) the reference to this Act does not include a reference to Part 6 (fees); for the application of Part 6, see Part 3 of this Schedule.
Regulated user-to-user services that include regulated provider pornographic content
4 (1) The duties set out in section 68 of this Act do not apply during the transitional period in relation to any regulated provider pornographic content published or displayed on a pre-existing Part 4B service.
(2) In the case of a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1), nothing in sub-paragraph (1) is to be taken to prevent the duties set out in section 68 from applying during the transitional period in relation to any regulated provider pornographic content published or displayed on any other part of the service.
(3) In this paragraph ‘regulated provider pornographic content’ and ‘published or displayed’ have the same meaning as in Part 5 of this Act (see section 66).
Pre-existing Part 4B services which form part of regulated user-to-user services
5 (1) During the transitional period, Part 4B of the Communications Act applies in relation to a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
(2) Sub-paragraph (3), and paragraphs 6 to 8, apply in relation to a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
(3) During the transitional period, this Act applies in relation to the regulated user-to-user service with the modifications set out in paragraph 6, 7, or 8 (whichever applies).
(4) In paragraphs 6 to 8 the dissociable section of the service which is the pre-existing Part 4B service is referred to as ‘the Part 4B part’.
(5) In sub-paragraph (3) the reference to this Act does not include a reference to Part 6 (fees); for the application of Part 6, see Part 3 of this Schedule.
Regulated user-to-user services with a Part 4B part and another user-to-user part
6 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service would still be a regulated user-to-user service even if the Part 4B part were to be assumed not to be part of the service.
(2) During the transitional period—
(a) any duty or requirement mentioned in paragraph 3(4) which applies in relation to the regulated service is to be treated as applying only in relation to the rest of the service;
(b) the powers mentioned in paragraph 3(5) are to be treated as applying only in relation to the rest of the service.
(3) In this paragraph ‘the rest of the service’ means any user-to-user part of the regulated service other than the Part 4B part.
Regulated user-to-user services with a Part 4B part and a search engine
7 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service would be a regulated search service if the Part 4B part were to be assumed not to be part of the service.
(2) During the transitional period, no duty or requirement mentioned in paragraph 3(4) applies in relation to the Part 4B part of the service (but that is not to be taken to prevent any other duty or requirement under this Act from applying in relation to the search engine of the service during the transitional period).
(3) During the transitional period, the powers mentioned in paragraph 3(5) are to be treated as applying only in relation to the search engine of the service.
Regulated user-to-user services with a Part 4B part but no other user-to-user part or search engine
8 (1) This paragraph applies in relation to a regulated user-to-user service described in paragraph 5(2) if the service does not fall within paragraph 6 or 7.
(2) The duties, requirements and powers mentioned in paragraph 3(4) and (5) do not apply in relation to the regulated service during the transitional period.
Risk assessments and children’s access assessments of pre-existing Part 4B services or of services which include a pre-existing Part 4B service
9 See Part 2A of Schedule 3 for provision about—
(a) the timing of risk assessments and children’s access assessments of pre-existing Part 4B services, and
(b) modifications of Parts 1 and 2 of that Schedule in connection with risk assessments and children’s access assessments of services which include a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
Operation of section 368U of the Communications Act
10 During the transitional period, section 368U of the Communications Act has effect as a requirement to establish and maintain an up to date list of persons providing a video-sharing platform service to which Part 4B applies.
Video-sharing platform services which start up, or start up again, during the transitional period
11 Part 4B of the Communications Act does not apply in relation to a video-sharing platform service which is first provided on or after the date when this Schedule comes into force.
12 (1) Sub-paragraph (2) applies in relation to a pre-existing Part 4B service if—
(a) the service ceases to be a video-sharing platform service on a date within the transitional period, and
(b) the service begins again to be a video-sharing platform service on some later date within the transitional period.
(2) Part 4B of the Communications Act does not start applying again in relation to the service on the date mentioned in sub-paragraph (1)(b).
13 Paragraphs 11 and 12 apply regardless of whether, or when, a provider of a service has notified the appropriate regulatory authority in accordance with section 368V of the Communications Act.
Part 3
Application of Part 6 of this Act: fees
Introduction
14 This Part makes provision about the application of the following provisions of this Act in relation to a person who is the provider of a relevant regulated service—
(a) section 70 (duty to notify OFCOM in relation to the charging of fees);
(b) section 71 (payment of fees);
(c) Schedule 10 (additional fees).
15 In this Part ‘relevant regulated service’ means—
(a) a regulated user-to-user service which is a pre-existing Part 4B service within the definition in paragraph (a) of paragraph 1(1), or
(b) a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1).
Application of section 70
16 (1) Sub-paragraph (2) applies in relation to a person who is the provider of a relevant regulated service, whether or not the person is the provider of any other regulated service.
(2) Section 70, which makes provision about the notification of OFCOM in relation to a charging year, applies to the provider in relation to every charging year, regardless of whether any part, or all, of a charging year falls within the transitional period.
17 (1) This paragraph applies in relation to a person who is the provider of a relevant regulated service, unless the person is an exempt provider (see paragraph 24).
(2) Sub-paragraph (3) applies in relation to the provider if—
(a) the provider is required by section 70 to give details to OFCOM of the provider’s qualifying worldwide revenue for the qualifying period that relates to a charging year,
(b) the provider gives such details in relation to that charging year at a time within the transitional period, and
(c) no regulations under section 196(2) have been made before that time specifying that section 172 is to come into force on or before the first day of that charging year.
(3) The provider’s notification under section 70 about qualifying worldwide revenue must include a breakdown indicating the amounts which are wholly referable to a relevant Part 4B service (if any).
Application of section 71: transitional charging year
18 If a person who is the provider of a relevant regulated service is an exempt provider, section 71 and Schedule 10 do not apply in relation to the provider in respect of a transitional charging year (see paragraph 23).
19 (1) If a person who is the provider of a relevant regulated service is not an exempt provider, section 71 and Schedule 10 apply in relation to the provider in respect of a transitional charging year.
(2) But sub-paragraphs (3) and (4) apply in relation to the provider in respect of a transitional charging year if the provider’s notification under section 70 in relation to that charging year has included details of amounts wholly referable to a relevant Part 4B service (as mentioned in paragraph 17(3)).
(3) For the purposes of the computation of the provider’s fee under section 71 in respect of the transitional charging year, references in that section to the provider’s qualifying worldwide revenue are to be taken to be references to the provider’s non-Part 4B qualifying worldwide revenue.
(4) OFCOM may not require the provider to pay a fee under section 71 in respect of the transitional charging year if the provider’s non-Part 4B qualifying worldwide revenue for the qualifying period that relates to that charging year is less than the threshold figure that has effect for that charging year.
(5) The amount of a provider’s ‘non-Part 4B qualifying worldwide revenue’ is the amount that would be the provider’s qualifying worldwide revenue (see section 72) if all amounts wholly referable to a relevant Part 4B service were left out of account.
Application of section 71: non-transitional charging year
20 (1) Sub-paragraph (2) applies in relation to a person who is the provider of a relevant regulated service, whether or not the person is the provider of any other regulated service.
(2) Section 71 and Schedule 10 apply without modification in relation to the provider in respect of a non-transitional charging year (even if the notification date in relation to such a charging year fell within the transitional period).
Amounts wholly referable to relevant Part 4B service
21 (1) For the purposes of this Part, OFCOM may produce a statement giving information about the circumstances in which amounts do, or do not, count as being wholly referable to a relevant Part 4B service.
(2) If OFCOM produce such a statement, they must publish it (and any revised or replacement statement).
Interpretation of this Part
22 In this Part—
“non-transitional charging year” means a charging year which is not a transitional charging year;
“notification date”, in relation to a charging year, means the latest date by which a notification under section 70 relating to that charging year is required to be given (see section 70(5));
“relevant Part 4B service” means—
(a) a regulated user-to-user service described in paragraph 15(a), or
(b) a pre-existing Part 4B service included in a regulated user-to-user service described in paragraph 15(b).
23 For the purposes of this Part a charging year is a “transitional charging year” if—
(a) the notification date in relation to that charging year fell within the transitional period, and
(b) no regulations under section 196(2) were made before the notification date specifying that section 172 was to come into force on or before the first day of that charging year.
24 (1) In this Part “exempt provider” means a person within sub-paragraph (2) or (3).
(2) A person is within this sub-paragraph if the person is the provider of only one regulated service, and that service is—
(a) a regulated user-to-user service which is a pre-existing Part 4B service within the definition in paragraph (a) of paragraph 1(1), or
(b) a regulated user-to-user service which—
(i) includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 1(1), and
(ii) does not fall within paragraph 6 or 7.
(3) A person is within this sub-paragraph if the person is the provider of more than one regulated service, and each regulated service is of a kind described in sub-paragraph (2).
25 In this Part the following terms have the same meaning as in Part 6 of this Act—
“charging year”;
“qualifying period”;
“threshold figure”.
Part 4
After the end of the transitional period
Interpretation of this Part
26 In this Part of this Schedule—
(a) “the repeal time” means the time when section 172 of this Act comes into force (repeal of Part 4B of the Communications Act);
(b) (except in paragraph (a)) references to sections are to sections of the Communications Act.
27 For the purposes of this Part an investigation relating to a person begins when OFCOM notify the person to that effect.
OFCOM as appropriate regulatory authority
28 The repeal of section 368T does not affect OFCOM’s powers to act after the repeal time as the appropriate regulatory authority under Part 4B of the Communications Act as it has effect by virtue of this Part of this Schedule.
Duties of service providers to co-operate with investigations
29 The repeal of section 368Y(3)(c) (duty to co-operate) does not affect the application of that provision after the repeal time in relation to—
(a) an investigation as mentioned in section 368Z10(3)(f) begun before that time, or
(b) any demand for information for the purpose mentioned in section 368Z10(3)(i) resulting from such an investigation.
Demands for information, and enforcement of such demands
30 (1) The repeal of sections 368Y(3)(b) and 368Z10 (demands for information) does not affect the application of those provisions after the repeal time in a case in which—
(a) OFCOM require information after the repeal time for the purposes of an investigation as mentioned in section 368Z10(3)(f), and
(b) the investigation was begun before that time.
(2) The repeal of sections 368Z2, 368Z4 and 368Z10 does not affect the application of those sections after the repeal time in connection with—
(a) a failure to comply with a requirement under section 368Z10 imposed before that time, or
(b) a failure to comply with a requirement imposed after that time under section 368Z10 as it has effect in a case mentioned in sub-paragraph (1).
(3) In this paragraph—
(a) “the purposes of an investigation” include the purposes of any enforcement action or proceedings resulting from an investigation;
(b) references to sections 368Z2 and 368Z4 include references to those sections as modified by section 368Z10.
Enforcement notifications, financial penalties etc
31 (1) The repeal of sections 368W and 368Z4 (enforcement of section 368V) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368W(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
(2) The repeal of sections 368Z2 and 368Z4 (enforcement of sections 368Y and 368Z1(6) and (7)) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368Z2(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
(3) The repeal of sections 368Z3 and 368Z4 (enforcement of sections 368Z1(1) and (2)) does not affect the application of those sections after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368Z3(1) before that time, or
(b) began, before that time, to investigate whether they may have grounds to make such a determination.
Suspension or restriction of service for contraventions or failures
32 (1) The repeal of section 368Z5 (suspension or restriction of service for contraventions or failures) does not affect the application of that section after the repeal time in a case in which OFCOM—
(a) made a determination as mentioned in section 368W(1), 368Z2(1) or 368Z3(1) before that time, or
(b) made such a determination after that time following an investigation begun before that time.
(2) The repeal of section 368Z5 does not affect the application of that section (as modified by section 368Z10) after the repeal time in a case in which—
(a) OFCOM are satisfied that a person failed to comply with a requirement under section 368Z10 imposed before that time, or
(b) OFCOM are satisfied that a person failed to comply with a requirement imposed after that time under section 368Z10 as it has effect in a case mentioned in paragraph 30(1).
(3) The repeal of sections 368Z7 (directions under sections 368Z5 and 368Z6) and 368Z8 (offence relating to such directions) does not affect the application of those sections after the repeal time in connection with a direction given under section 368Z5 as it has effect by virtue of this paragraph.”—(Paul Scully.)
Parts 2 and 3 of this new Schedule contain transitional provisions etc dealing with how services currently regulated by Part 4B of the Communications Act 2003 (“video-sharing platform services”) make the transition to regulation under the Online Safety Bill. Part 4 of this new Schedule contains saving provisions operating after the repeal of Part 4B.
Brought up, and added to the Bill.
Schedule 3
Timing of providers’ assessments
Amendment made: 238, page 175, line 11, at end insert—
“Part 2A
Pre-existing Part 4B Services
Interpretation of this Part
6A (1) In this Part, “pre-existing Part 4B service” means—
(a) an internet service which—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1) and (2) of the Communications Act being met in relation to the service as a whole, and
(ii) was being provided immediately before Schedule (Video-sharing platform services: transitional provision etc) (video-sharing platform services: transitional provision etc) comes into force; or
(b) a dissociable section of an internet service, where that dissociable section—
(i) is a video-sharing platform service by reason of the conditions in section 368S(1)(a) and (2) of the Communications Act being met in relation to that dissociable section, and
(ii) was being provided immediately before Schedule (Video-sharing platform services: transitional provision etc) comes into force.
(2) In sub-paragraph (1), any reference to a service provided before Schedule (Video-sharing platform services: transitional provision etc) comes into force includes a reference to a service provided in breach of the requirement in section 368V of the Communications Act.
6B (1) In this Part, “assessment start day”, in relation to a pre-existing Part 4B service, means—
(a) the date specified in regulations made by the Secretary of State for the purposes of this Part of this Schedule, or
(b) if the pre-existing Part 4B service ceases to be a video-sharing platform service before the date specified in the regulations, the date when that service ceases to be a video-sharing platform service.
(2) But in respect of any period during which this Schedule is fully in force and no regulations under sub-paragraph (1) have yet been made, the definition in sub-paragraph (1) has effect as if—
(a) for paragraph (a) there were substituted “the date when section 172 comes into force”, and
(b) in paragraph (b), for “specified in the regulations” there were substituted “when section 172 comes into force”.
6C In this Part “video-sharing platform service” has the same meaning as in Part 4B of the Communications Act (see section 368S of that Act).
6D Any reference in this Part to the effect of Part 1 or 2 of this Schedule is a reference to the effect that Part 1 or 2 would have if this Part were disregarded.
Pre-existing Part 4B services which are regulated user-to-user services
Application of paragraphs 6F to 6H
6E (1) This paragraph and paragraphs 6F to 6H apply in relation to a pre-existing Part 4B service which—
(a) is within the definition in paragraph (a) of paragraph 6A(1), and
(b) is also a regulated user-to-user service.
(2) If the effect of Part 1 of this Schedule is that the period within which the first illegal content risk assessment or CAA of the service must be completed begins on a day before the assessment start day, the time for carrying out that assessment is extended as set out in paragraph 6F or 6G.
(3) If the effect of paragraph 6 is that the period within which the first adults’ risk assessment of the service must be completed begins on a day before the assessment start day, the time for carrying out that risk assessment is extended as set out in paragraph 6H.
(4) But paragraphs 6F to 6H do not apply if the service ceases to be a regulated user-to-user service on the assessment start day.
Illegal content risk assessments and children’s access assessments
6F (1) Sub-paragraphs (2) and (3) apply in relation to the service if, on the assessment start day, illegal content risk assessment guidance is available but the first CAA guidance has not yet been published.
(2) The first illegal content risk assessment of the service must be completed within the period of three months beginning with the assessment start day.
(3) The first CAA of the service must be completed within the period of three months beginning with the day on which the first CAA guidance is published.
6G If, on the assessment start day, illegal content risk assessment guidance and CAA guidance are both available, both of the following must be completed within the period of three months beginning with that day—
(a) the first illegal content risk assessment of the service, and
(b) the first CAA of the service.
Adults’ risk assessments
6H (1) If adults’ risk assessment guidance is available on the assessment start day, the first adults’ risk assessment of the service must be completed within the period of three months beginning with that day.
(2) If, on the assessment start day, the first adults’ risk assessment guidance has not yet been published, the first adults’ risk assessment of the service must be completed within the period of three months beginning with the day on which the first adults’ risk assessment guidance is published.
Regulated user-to-user services which include a pre-existing Part 4B service
Application of paragraphs 6J to 6N
6I (1) Paragraphs 6J to 6N make provision about the timing of assessments in the case of a regulated user-to-user service which includes a pre-existing Part 4B service within the definition in paragraph (b) of paragraph 6A(1).
(2) In sub-paragraph (3) and paragraphs 6J to 6N—
(a) “the regulated service” means the regulated user-to-user service, and
(b) “the Part 4B part” means the pre-existing Part 4B service which is included in the regulated service.
(3) If the effect of Part 1 or paragraph 6 of this Schedule is that the period within which the first illegal content risk assessment, CAA or adults’ risk assessment of the regulated service must be completed begins on a day before the assessment start day—
(a) the time for carrying out the assessment in question in relation to the Part 4B part is extended as set out in paragraph 6J, 6K or 6L (whichever applies),
(b) Part 1 and paragraph 6 apply as set out in paragraph 6M, and
(c) paragraph 5 applies as set out in paragraph 6N.
(4) But paragraphs 6J to 6N do not apply if the service ceases to be a regulated user-to-user service on the assessment start day.
Illegal content risk assessments and children’s access assessments of Part 4B part
6J (1) Sub-paragraphs (2) and (3) apply in relation to the Part 4B part if, on the assessment start day, illegal content risk assessment guidance is available but the first CAA guidance has not yet been published.
(2) The first illegal content risk assessment of the Part 4B part must be completed within the period of three months beginning with the assessment start day.
(3) The first CAA of the Part 4B part must be completed within the period of three months beginning with the day on which the first CAA guidance is published.
6K If, on the assessment start day, illegal content risk assessment guidance and CAA guidance are both available, both of the following must be completed within the period of three months beginning with that day—
(a) an illegal content risk assessment of the Part 4B part, and
(b) a CAA of the Part 4B part.
Adults’ risk assessments of Part 4B part
6L (1) If adults’ risk assessment guidance is available on the assessment start day, an adults’ risk assessment of the Part 4B part must be completed within the period of three months beginning with that day.
(2) If, on the assessment start day, the first adults’ risk assessment guidance has not yet been published, an adults’ risk assessment of the Part 4B part must be completed within the period of three months beginning with the day on which the first adults’ risk assessment guidance is published.
Application of Part 1 and paragraph 6
6M (1) This paragraph applies in relation to—
(a) an illegal content risk assessment or a CAA of the regulated service if an assessment of that kind is due to be carried out in relation to the Part 4B part of the service in accordance with paragraph 6J or 6K;
(b) an adults’ risk assessment of the regulated service if an adults’ risk assessment is due to be carried out in relation to the Part 4B part of the service in accordance with paragraph 6L.
References in the rest of this paragraph to an illegal content risk assessment, a CAA or an adults’ risk assessment are to an assessment of that kind to which this paragraph applies.
(2) For the purposes of this paragraph—
(a) the regulated service is “type 1” if it would still be a regulated user-to-user service even if the Part 4B part were to be assumed not to be part of the service;
(b) the regulated service is “type 2” if it would be a regulated search service if the Part 4B part were to be assumed not to be part of the service;
(c) the regulated service is “type 3” if it does not fall within paragraph (a) or (b).
(3) If the regulated service is type 1, an illegal content risk assessment, a CAA or an adults’ risk assessment is to be treated as being due at the time provided for by Part 1 or paragraph 6 only in relation to the rest of the service.
(4) In sub-paragraph (3) “the rest of the service” means any user-to-user part of the regulated service other than the Part 4B part.
(5) If the regulated service is type 2—
(a) an illegal content risk assessment is not required to be carried out at the time provided for by Part 1, but that is not to be taken to prevent an illegal content risk assessment as defined by section 23 from being due in relation to the search engine of the service at the time provided for by Part 1;
(b) a CAA is to be treated as being due at the time provided for by Part 1 only in relation to the search engine of the service;
(c) an adults’ risk assessment is not required to be carried out at the time provided for by paragraph 6.
(6) If the regulated service is type 3, no illegal content risk assessment, CAA or adults’ risk assessment is required to be carried out at the time provided for by Part 1 or paragraph 6.
Application of paragraph 5
6N (1) This paragraph sets out how paragraph 5 (children’s risk assessments) is to apply if a CAA is required to be carried out in accordance with—
(a) paragraph 6J or 6K (CAA of Part 4B part of a service),
(b) paragraph 6M(3) (CAA of the rest of a service), or
(c) paragraph 6M(5)(b) (CAA of search engine of a service).
(2) The definition of “the relevant day” is to operate by reference to the CAA that was (or was required to be) carried out, and accordingly, references to the day on which the service is to be treated as likely to be accessed by children are to be read as references to the day on which the Part 4B part of the service, the rest of the service or the search engine of the service (as the case may be) is to be treated as likely to be accessed by children.
(3) References to a children’s risk assessment of the service are to a children’s risk assessment of the Part 4B part of the service, the rest of the service or the search engine of the service (as the case may be).”—(Paul Scully.)
This amendment deals with the timing of risk assessments etc to be carried out by providers of services currently regulated by Part 4B of the Communications Act 2003. The requirement to do the assessments is triggered on the date set in regulations under new paragraph 6B(1) of Schedule 3.
Schedule 13
Penalties imposed by OFCOM under Chapter 6 of Part 7
Amendment made: 230, page 212, leave out lines 13 to 18.—(Paul Scully.)
This amendment is consequential on NC42.
Schedule 14
Amendments consequential on offences in Part 10 of this Act
Amendments made: 253, page 212, line 36, at end insert—
“Football Spectators Act 1989
A1 In Schedule 1 to the Football Spectators Act 1989 (football banning orders: relevant offences), after paragraph 1(y) insert—
“(z) any offence under section 152 (false communications) or 153 (threatening communications) of the Online Safety Act 2022—
(i) which does not fall within paragraph (d), (e), (m), (n), (r) or (s),
(ii) as respects which the court has stated that the offence is aggravated by hostility of any of the types mentioned in section 66(1) of the Sentencing Code (racial hostility etc), and
(iii) as respects which the court makes a declaration that the offence related to a football match, to a football organisation or to a person whom the accused knew or believed to have a prescribed connection with a football organisation.””
This amendment concerns offences relevant to the making of football banning orders. The new false and threatening communications offences under this Bill are added for that purpose.
Amendment 254, page 212, line 40, leave out paragraph (a).
This amendment has the effect of retaining a reference to section 127(1) of the Communications Act 2003 in the Sexual Offences Act 2003.
Amendment 255, page 213, leave out lines 2 and 3.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 256, page 213, line 4, leave out “63E” and insert “63D”.
This amendment is consequential on Amendment 255.
Amendment 257, page 213, line 4, leave out “that Act” and insert
“the Online Safety Act 2022”.
This amendment is consequential on Amendment 255.
Amendment 258, page 213, line 6, leave out “63F” and insert “63E”.
This amendment is consequential on Amendment 255.
Amendment 259, page 213, line 12, leave out paragraph (a).
This amendment has the effect of retaining a reference to the Malicious Communications Act 1988 in the Regulatory Enforcement and Sanctions Act 2008.
Amendment 260, page 213, line 15, leave out “151,”.
This amendment is consequential on the omission of clause 151 (see Amendment 239).
Amendment 261, page 213, line 15, at end insert—
“Elections Act 2022
2A In Schedule 9 to the Elections Act 2022 (offences for purposes of Part 5), in Part 2, after paragraph 52 insert—
“Online Safety Act 2022
52A An offence under any of the following provisions of the Online Safety Act 2022—
(a) section 152 (false communications);
(b) section 153 (threatening communications);
(c) section (Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland) (sending flashing images).””
This amendment concerns offences relevant for Part 5 of the Elections Act 2022 (disqualification from holding elective office). The new false and threatening communications offences under this Bill, and the new epilepsy trolling offence (see NC53), are added for that purpose.
Amendment 233, page 214, line 23, at end insert—
“Elections Act 2022
9 In Schedule 9 to the Elections Act 2022 (offences for purposes of Part 5), after paragraph 47(f) insert—
“(i) section 66A (sending etc photograph or film of genitals).””—(Paul Scully.)
This amendment concerns offences relevant for Part 5 of the Elections Act 2022 (disqualification from holding elective office). The amendment adds a reference to the new offence (cyber-flashing) inserted into the Sexual Offences Act 2003 by clause 157 of this Bill.