All 6 Paul Scully contributions to the Online Safety Act 2023

Mon 5th Dec 2022
Tue 13th Dec 2022: Online Safety Bill (First sitting), Public Bill Committees. Committee stage (re-committed clauses and schedules): 1st sitting
Tue 13th Dec 2022: Online Safety Bill (Second sitting), Public Bill Committees. Committee stage (re-committed clauses and schedules): 2nd sitting
Thu 15th Dec 2022: Online Safety Bill (Third sitting), Public Bill Committees. Committee stage (re-committed clauses and schedules): 3rd sitting
Tue 17th Jan 2023
Tue 12th Sep 2023: Online Safety Bill, Commons Chamber. Consideration of Lords amendments

Online Safety Bill

Paul Scully Excerpts
Mr Speaker (Lindsay Hoyle)

With this it will be convenient to discuss the following:

Government new clause 12—Warning notices.

Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.

Government new clause 40—Amendment of Enterprise Act 2002.

Government new clause 42—Former providers of regulated services.

Government new clause 43—Amendments of Part 4B of the Communications Act.

Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.

Government new clause 51—Publication by providers of details of enforcement action.

Government new clause 52—Exemptions from offence under section 152.

Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).

New clause 1—Provisional re-categorisation of a Part 3 service

“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.

(2) If OFCOM—

(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and

(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,

New clause 16—Communication offence for encouraging or assisting self-harm

“(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an act and D2 does that act, D is also to be treated as having done that act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards P or to promote the interests of P’s health or wellbeing.””

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

New clause 17—Liability of directors for compliance failure

“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.

(2) If OFCOM considers that the failure results from any—

(a) action,

(b) direction,

(c) neglect, or

(d) with the consent

This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.

New clause 23—Financial support for victims support services

“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.

(2) Those regulations must—

(a) specify criteria setting out which victim support services are eligible for financial support under this provision;

(b) set out a means by which the amount of funding available should be determined;

(c) make provision for the funding to be reviewed and allocated on a three year basis.

(3) Regulations under this section—

(a) shall be made by statutory instrument, and

(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”

New clause 28—Establishment of Advocacy Body

“(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.

(2) A “child user”—

(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and

(b) includes both any existing child user and any future child user.

(3) The work of the Advocacy Body may include—

(a) representing the interests of child users;

(b) the protection and promotion of these interests;

(c) any other matter connected with those interests.

(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—

(a) safety duties about illegal content, in particular CSEA content;

(b) safety duties protecting children;

(c) “enforceable requirements” relating to children.

(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.

(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.

(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.

(8) The Advocacy Body may undertake research on their own account.

(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.

(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.

(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”

New clause 29—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;

(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;

(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—

(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;

(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;

(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);

(e) to promote better coordination within the media literacy sector.

(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 30—Media literacy strategy

“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).

(2) The strategy must—

(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),

(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;

(c) explain why OFCOM considers that the steps it proposes to take will be effective;

(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.

(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.

(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.

(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—

(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;

(b) the advisory committee on disinformation and misinformation, and

(c) any other person that OFCOM consider appropriate.

(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—

(a) revise the strategy, or

(b) publish an explanation of why they have decided not to revise it.

(7) If OFCOM decides to revise the strategy they must—

(a) consult in accordance with subsection (3), and

(b) publish the revised strategy.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 31—Research conducted by regulated services

“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.

(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—

(a) a specific piece of research held by the service, or

(b) all research the service holds on a topic specified by OFCOM.”

New clause 34—Factual Accuracy

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—

(a) produced user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content.

(3) The index under subsection (1) must—

(a) satisfy minimum quality criteria to be set by OFCOM, and

(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”

New clause 35—Duty of balance

“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.

(2) Any Regulated Service which selects or prioritises particular—

(a) user-generated content,

(b) news publisher content, or

(c) comments and reviews on provider content

New clause 36—Identification of information incidents by OFCOM

“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.

(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—

(a) identifying, and assessing the severity of, actual or potential information incidents; and

(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).

(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—

(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and

(b) publish such recommendations or other information that OFCOM considers appropriate.

(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.

(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—

(a) the matters it will take into account in determining whether an information incident has arisen;

(b) the matters it will take into account in determining the severity of an incident; and

(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.

(6) For the purposes of this section—

“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;

“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”

This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.

New clause 37—Duty to promote media literacy: regulated user-to-user services and search services

“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.

(2) This section applies only in relation to OFCOM’s duty to regulate—

(a) user-to-user services, and

(b) search services.

(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—

(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—

(i) indicate the nature of content on a service (for example, show where it is an advertisement);

(ii) indicate the reliability and accuracy of the content; and

(iii) facilitate control over what content is received;

(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;

(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.

(4) OFCOM must prepare guidance about—

(a) the matters referred to in subsection (3) as it considers appropriate; and

(b) minimum standards that media literacy initiatives must meet.

(5) Where OFCOM prepares guidance under subsection (4) it must—

(a) publish the guidance (and any revised or replacement guidance); and

(b) keep the guidance under review.

(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”

This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.

New clause 45—Sharing etc intimate photographs or film without consent

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—

(a) the photograph or film only shows activity that would be ordinarily seen on a public street, except for a photograph or film of breastfeeding;

(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;

(d) the photograph or film has been previously shared with consent in public;

(e) A reasonably believed that the photograph or film had been previously shared with consent in public;

(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;

(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.

(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.

(5) It is a defence for a person charged with an offence under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;

(c) reasonably believed that the sharing was necessary for the administration of justice;

(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and

(e) reasonably believed that the sharing was in the public interest.

(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(7) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(8) “Photograph” includes the negative as well as the positive version.

(9) “Film” means a moving image.

(10) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”

This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.

New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification

“(1) A person (A) commits an offence if—

(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and

(b) A does so—

(i) without B’s consent, and

(ii) without reasonably believing that B consents; and

(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(4) References to sharing such a photograph or film with another person include—

(a) sending it to another person by any means, electronically or otherwise;

(b) showing it to another person;

(c) placing it for another person to find; or

(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.

(5) “Photograph” includes the negative as well as the positive version.

(6) “Film” means a moving image.

(7) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(9) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates a more serious offence where the image is shared for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 48—Threatening to share etc intimate photographs or film

“(1) A person (A) commits an offence if—

(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and

(b) A intends B to fear that the threat will be carried out; or A is reckless as to whether B will fear that the threat will be carried out.

(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.

(3) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—

(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;

(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;

(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;

(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.

(5) References to sharing, or threatening to share, such a photograph or film with another person include—

(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;

(b) showing, or threatening to show, it to another person;

(c) placing, or threatening to place, it for another person to find; or

(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.

(6) “Photograph” includes the negative as well as the positive version.

(7) “Film” means a moving image.

(8) References to a photograph or film include—

(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,

(b) an image which has been altered through computer graphics,

(c) a copy of a photograph, film or image, and

(d) data stored by any means which is capable of conversion into a photograph, film or image.

(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.

(10) A person who commits an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);

(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”

This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.

New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images

“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.

(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, video testifying etc.) which is already prescribed by law.

New clause 50—Anonymity for victims of offences involving the sharing of intimate images

“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.

(2) In subsection 1 after paragraph (db) insert—

(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”

Similar to NC49, this new clause allows victims of intimate image abuse the same availability for anonymity as other sexual offences to protect their identities and give them the confidence to testify against their abuser without fear of repercussions.

New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements

“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.

(2) The report must be laid before Parliament within six months of the passing of this Act.”

New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration

“(1) A person (A) commits an offence if—

(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—

(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or

(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and

(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.

(2) References to a third person (C) in this section are to be read as referring to—

(a) an individual;

(b) a group of individuals;

(c) a section of the public; or

(d) the public at large.

(3) A person (A) does not commit an offence under this section if—

(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;

(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of subsection (3) of section 25A of the Immigration Act 1971;

(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.

(4) It is a defence for a person charged under this section to prove that they—

(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime, and

(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.

(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”

This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.

Government amendments 234 and 102 to 117.

Amendment 195, in clause 104, page 87, line 10, leave out subsection 1 and insert—

“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—

“(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service;

(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”

Amendment 152, page 87, line 18, leave out ‘whether’.

This amendment is consequential on Amendment 153.

Amendment 153, page 87, line 19, leave out ‘or privately’.

This amendment removes the ability to monitor encrypted communications.

Government amendment 118.

Amendment 204, in clause 105, page 89, line 17, at end insert—

“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”

This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.

Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.

Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).

Government amendment 175.

Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).

This amendment removes the bar of conditionality that must be met for super complaints that relate to a single regulated service.

Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.

Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—

“(a) The Secretary of State, and

“(b) such other persons as OFCOM considers appropriate.”

This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.

Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert

“90 day maximum time limits in relation to the determination and notification to the complainant of—”.

This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.

Amendment 26, in clause 146, page 123, line 33, leave out

“give OFCOM a direction requiring”

and insert “may make representations to”.

Amendment 27, page 123, line 36, leave out subsection (2) and insert—

“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”

Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert

“established under this section is to consist of the following members—”.

Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert

“established under this section must”.

Amendment 30, page 124, line 4, leave out subsection (5).

Amendment 32, page 124, line 4, leave out clause 148.

Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.

Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—

“(a) B has not consented for A to send or give the photograph or film to B, and”.

Government amendments 249 to 252, 228, 229 and 235 to 237.

Government new schedule 2—Amendments of Part 4B of the Communications Act.

Government new schedule 3—Video-sharing platform services: transitional provision etc.

Government amendment 238.

Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.

This amendment would give the power to make regulations under Schedule 11 to OFCOM.

Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.

Amendment 1, page 198, line 9, at end insert—

“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”

Amendment 159, page 198, line 9, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.

Amendment 9, page 198, line 28, leave out “and” and insert “or”.

Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.

Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.

Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.

This amendment is consequential on Amendment 35.

Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.

Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.

Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.

Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).

This amendment is consequential on Amendment 35.

Government amendments 230, 253 to 261 and 233.

Paul Scully

I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.

I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.

The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.

Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.

Priti Patel (Witham) (Con)

I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?

Paul Scully

It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.

With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.

New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.

Rehman Chishti (Gillingham and Rainham) (Con)

Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.

Paul Scully

This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.

Charlotte Nichols (Warrington North) (Lab)

Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?

--- Later in debate ---
Paul Scully

Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.

Priti Patel

This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services. In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.

Paul Scully

My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.

New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.

The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.

Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.

Kit Malthouse (North West Hampshire) (Con)

Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?

--- Later in debate ---
Paul Scully

I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.

Kit Malthouse

I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?

Paul Scully

Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.

Mr David Davis (Haltemprice and Howden) (Con)

My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?

Paul Scully

The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.

Damian Collins (Folkestone and Hythe) (Con)

I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?

Paul Scully

My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, that will be a criminal liability under the Bill.

Dr Luke Evans (Bosworth) (Con)

To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.

Mr Speaker (Lindsay Hoyle)

Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.

Paul Scully Portrait Paul Scully
- Hansard - -

Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.

The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.

Sajid Javid Portrait Sajid Javid (Bromsgrove) (Con)
- Hansard - - - Excerpts

My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?

Paul Scully Portrait Paul Scully
- Hansard - -

Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.

Jim Shannon Portrait Jim Shannon (Strangford) (DUP)
- Hansard - - - Excerpts

In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?

Paul Scully Portrait Paul Scully
- Hansard - -

I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.

Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.

The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.

The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.

New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.

Suzanne Webb Portrait Suzanne Webb (Stourbridge) (Con)
- Hansard - - - Excerpts

The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.

Paul Scully Portrait Paul Scully
- Hansard - -

A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.

Dean Russell Portrait Dean Russell (Watford) (Con)
- Hansard - - - Excerpts

I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.

Paul Scully Portrait Paul Scully
- Hansard - -

I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.

We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish or notify their users of enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details or notify its UK users about enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users about breaches of the duty in the Bill. It will help users make much more informed decisions about the services they use, and act as an additional deterrent factor for service providers.

Luke Evans Portrait Dr Luke Evans
- Hansard - - - Excerpts

It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.

Paul Scully Portrait Paul Scully
- Hansard - -

I think this is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, this will be hidden in plain sight. That would not be useful, particularly for parents, to best understand what is going on. It is all about making more informed decisions.

The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.

Baroness Hodge of Barking Portrait Dame Margaret Hodge (Barking) (Lab)
- Hansard - - - Excerpts

The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?

Paul Scully Portrait Paul Scully
- Hansard - -

The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.

Paul Scully Portrait Paul Scully
- Hansard - -

Thank you, Mr Speaker; I will try to keep my remarks very much in scope.

The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.

Jim Shannon Portrait Jim Shannon
- Hansard - - - Excerpts

This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.

Paul Scully Portrait Paul Scully
- Hansard - -

The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—

Paul Scully Portrait Paul Scully
- Hansard - -

That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.

I talked about harmful communications, recognising that we could leave a potential gap in the criminal law. The Government have also decided not to repeal existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Along with planned changes to the harmful communications offence, we are making a number of additional changes to the Bill—that will come later, Mr Speaker, and I will not tread too much into that, as it includes the removal of the adult safety duties, often referred to as the legal but harmful provision. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.

The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make clear that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access by those below their minimum age, and to enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.

Paul Scully Portrait Paul Scully
- Hansard - -

That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.

Mike Amesbury Portrait Mike Amesbury (Weaver Vale) (Lab)
- Hansard - - - Excerpts

On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.

Paul Scully Portrait Paul Scully
- Hansard - -

As I said, the social media platforms will have to put in place robust age assurance and age verification for material in an accredited form that is acceptable to Ofcom, which will look at that.

Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the commissioner for victims and witnesses and the Domestic Abuse Commissioner as statutory consultees for the code of practice and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies will have to take proactive measures to tackle such content.

Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.

Ruth Edwards Portrait Ruth Edwards (Rushcliffe) (Con)
- Hansard - - - Excerpts

I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.

The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.

We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will use the Bill to criminalise that activity, providing users with protections from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.

Paul Scully Portrait Paul Scully
- Hansard - -

We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.

Vicky Ford Portrait Vicky Ford (Chelmsford) (Con)
- Hansard - - - Excerpts

On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. Absolutely that area will be tackled, especially for children, but it is really important—as we will see from further changes in the Bill—that, with the removal of the legal but harmful protections, there are other protections for adults.

Sajid Javid Portrait Sajid Javid
- Hansard - - - Excerpts

I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?

Paul Scully Portrait Paul Scully
- Hansard - -

On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.

Caroline Dinenage Portrait Dame Caroline Dinenage (Gosport) (Con)
- Hansard - - - Excerpts

Will my hon. Friend give way?

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

Will my hon. Friend give way?

Paul Scully Portrait Paul Scully
- Hansard - -

I will give way first to one of my predecessors.

Caroline Dinenage Portrait Dame Caroline Dinenage
- Hansard - - - Excerpts

I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?

Paul Scully Portrait Paul Scully
- Hansard - -

If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

Will the Minister give way?

Paul Scully Portrait Paul Scully
- Hansard - -

I will give way a final time before I finish.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - -

I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voice is heard in the implementation of the Bill. We are also bringing in coercive control as one of the areas. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit, a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive abuse is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.

Paul Scully Portrait Paul Scully
- Hansard - -

I will give way and then finish up.

Jamie Stone Portrait Jamie Stone
- Hansard - - - Excerpts

I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?

Paul Scully Portrait Paul Scully
- Hansard - -

There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.

Paul Scully Portrait Paul Scully
- Hansard - -

I will make a bit of progress, because I am testing Mr Speaker’s patience.

We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that the licence holder is exempt only to the extent that communication is within the course of a licensed activity. A separate group of technical amendments ensures that the definition of sending false and threatening communications will capture all circumstances—that is far wider than we have at the moment.

We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.

Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.

Maria Miller Portrait Dame Maria Miller (Basingstoke) (Con)
- Hansard - - - Excerpts

I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.

Paul Scully Portrait Paul Scully
- Hansard - -

Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.

I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.

Paul Scully Portrait Paul Scully
- Hansard - -

I will give way a final time.

Debbie Abrahams Portrait Debbie Abrahams
- Hansard - - - Excerpts

I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?

Paul Scully Portrait Paul Scully
- Hansard - -

I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.

Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.

Lindsay Hoyle Portrait Mr Speaker
- Hansard - - - Excerpts

I call the shadow Minister.

--- Later in debate ---
Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need the accountability of the named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.

I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their lordships to look at tightening and accelerating the age verification and giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government’s name, but it needs to go further when it goes to the other place.

Paul Scully Portrait Paul Scully
- View Speech - Hansard - -

I thank Members for their contributions during today’s debate and for their ongoing engagement with such a crucial piece of legislation. I will try to respond to as many of the issues raised as possible.

My right hon. Friend the Member for Haltemprice and Howden (Mr Davis), who is not in his place, proposed adding in promoting self-harm as a criminal offence. The Government are sympathetic to the intention behind that proposal; indeed, we asked the Law Commission to consider how the criminal law might address that, and have agreed in principle to create a new offence of encouraging or assisting serious self-harm. The form of the offence recommended by the Law Commission is based on the broadly comparable offence of encouraging or assisting suicide. Like that offence, it covers the encouragement of, or assisting in, self-harm by means of communication and in other ways. When a similar amendment was tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman) in Committee, limiting the offence to encouragement or assistance by means of sending a message, the then Minister, my right hon. Friend the Member for Croydon South, said it would give only partial effect to the Law Commission’s recommendation. It remains the Government’s intention to give full effect to the Law Commission’s recommendations in due course.

--- Later in debate ---
William Cash Portrait Sir William Cash
- Hansard - - - Excerpts

I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?

Paul Scully Portrait Paul Scully
- Hansard - -

I was going to come to my hon. Friend in two seconds.

In the absence of clearly defined offences, the changes we are making to the Bill mean that it is likely to be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools necessary to drive the necessary culture change in the sector, from the boardroom down.

This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.

Baroness Hodge of Barking Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?

Paul Scully Portrait Paul Scully
- Hansard - -

I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.

On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.

Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.

Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.

As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.

As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies out there that Ofcom will already be reaching out to and there is an abundance of experience in committed representative groups that are already engaged and will be engaged with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding an additional statutory body would duplicate existing provision, creating a confusing landscape, and that would not be in the best interests of children.

Sarah Champion Portrait Sarah Champion
- Hansard - - - Excerpts

I hear what the Minister is saying about creating a statutory body, but will he assure this House that there is a specific vehicle for children’s voices to be heard in this? I ask because most of us here are not facing the daily traumas and constant recreation of different apps and social media ways to reach out to children that our children are. So unless we have their voice heard, this Bill is not going to be robust enough.

Paul Scully Portrait Paul Scully
- Hansard - -

As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.

We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.

My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need to have joined-up government, to make sure that lots of bits of legislation and all Departments are working in this space. This is a really good example of joined-up government, where we have to join together.

Natalie Elphicke Portrait Mrs Elphicke
- Hansard - - - Excerpts

Will the Minister confirm that, in line with the discussions that have been had, the Government will look to bring back amendments, should they be needed, in line with new clause 55 and perhaps schedule 7, as the Bill goes to the Lords or returns for further consideration in this House?

Paul Scully Portrait Paul Scully
- Hansard - -

All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.

We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deep fake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.

We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities for failing to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want to allow the victim support service to provide consistency for victims requiring support.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.

Paul Scully Portrait Paul Scully
- Hansard - -

I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.

Maria Miller Portrait Dame Maria Miller
- Hansard - - - Excerpts

Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.

Paul Scully Portrait Paul Scully
- Hansard - -

I understand. We are ahead of the Lords on publication, so yes is the answer.

I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach for category 1 designation—not just the smaller companies that he was talking about, but companies that are pushing that barrier to get to category 1. I very much get his view that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences that he describes. I look forward to working with him to get the changes to the Bill to work exactly as he describes.

Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.

John McDonnell Portrait John McDonnell
- Hansard - - - Excerpts

Just briefly, because I know that the Minister is about to finish, can he respond on amendment 204 with regard to the protection of journalists?

Paul Scully Portrait Paul Scully
- Hansard - -

I am happy to continue talking to the right hon. Gentleman, but I believe that we have enough protections in the Bill, with the human touch that we have added after the automatic flagging up of inquiries. The NCA will also have to have due regard to protecting sources. I will continue to work with him on that.

I have not covered everybody’s points, but this has been a very productive debate. I thank everyone for their contributions. We are really keen to get the Bill on the books and to act quickly to ensure that we can make children as safe as possible online.

Question put and agreed to.

New clause 11 accordingly read a Second time, and added to the Bill.

New Clause 12

Warning notices

‘(1) OFCOM may give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) to a provider relating to a service or part of a service only after giving a warning notice to the provider that they intend to give such a notice relating to that service or that part of it.

(2) A warning notice under subsection (1) relating to the use of accredited technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a) and (3)(a)) must—

(a) contain details of the technology that OFCOM are considering requiring the provider to use,

(b) specify whether the technology is to be required in relation to terrorism content or CSEA content (or both),

(c) specify any other requirements that OFCOM are considering imposing (see section 106(2) to (4)),

(d) specify the period for which OFCOM are considering imposing the requirements (see section 106(6)),

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(3) A warning notice under subsection (1) relating to the development or sourcing of technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) and (3)(b)) must—

(a) describe the proposed purpose for which the technology must be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),

(b) specify steps that OFCOM consider the provider needs to take in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),

(c) specify the proposed period within which the provider must take each of those steps,

(d) specify any other requirements that OFCOM are considering imposing,

(e) state that the provider may make representations to OFCOM (with any supporting evidence), and

(f) specify the period within which representations may be made.

(4) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) that relates to both the user-to-user part of a combined service and the search engine of the service (as described in section (Notices to deal with terrorism content or CSEA content (or both))(4)(c) or (d)) may be given to the provider of the service only if—

(a) two separate warning notices have been given to the provider (one relating to the user-to-user part of the service and the other relating to the search engine), or

(b) a single warning notice relating to both the user-to-user part of the service and the search engine has been given to the provider.

(5) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) may not be given to a provider until the period allowed by the warning notice for the provider to make representations has expired.’—(Paul Scully.)

This clause, which would follow NC11, also replaces part of existing clause 104. There are additions to the warning notice procedure to take account of the new options for notices under NC11.

Brought up, read the First and Second time, and added to the Bill.

New Clause 20

OFCOM’s reports about news publisher content and journalistic content

‘(1) OFCOM must produce and publish a report assessing the impact of the regulatory framework provided for in this Act on the availability and treatment of news publisher content and journalistic content on Category 1 services (and in this section, references to a report are to a report described in this subsection).

(2) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of two years beginning with the day on which sections (Duties to protect news publisher content) and 16 come into force (or if those sections come into force on different days, the period of two years beginning with the later of those days).

(3) A report must, in particular, consider how effective the duties to protect such content set out in sections (Duties to protect news publisher content) and 16 are at protecting it.

(4) In preparing a report, OFCOM must consult—

(a) persons who represent recognised news publishers,

(b) persons who appear to OFCOM to represent creators of journalistic content,

(c) persons who appear to OFCOM to represent providers of Category 1 services, and

(d) such other persons as OFCOM consider appropriate.

(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.

(6) The Secretary of State may require OFCOM to produce and publish a further report if the Secretary of State considers that the regulatory framework provided for in this Act is, or may be, having a detrimental effect on the availability and treatment of news publisher content or journalistic content on Category 1 services.

(7) But such a requirement may not be imposed—

(a) within the period of three years beginning with the date on which the first report is published, or

(b) more frequently than once every three years.

(8) For further provision about reports under this section, see section 138.

(9) In this section—

“journalistic content” has the meaning given by section 16;

“news publisher content” has the meaning given by section 49;

“recognised news publisher” has the meaning given by section 50.

(10) For the meaning of “Category 1 service”, see section 82 (register of categories of services).’—(Paul Scully.)

This inserts a new clause (after clause 135) which requires Ofcom to publish a report on the impact of the regulatory framework provided for in the Bill within two years of the relevant provisions coming into force. It also allows the Secretary of State to require Ofcom to produce further reports.

Brought up, read the First and Second time, and added to the Bill.

New Clause 40

Amendment of Enterprise Act 2002

‘In Schedule 15 to the Enterprise Act 2002 (enactments relevant to provisions about disclosure of information), at the appropriate place insert—

‘Online Safety Act 2022.”’—(Paul Scully.)



This amendment has the effect that the information gateway in section 241 of the Enterprise Act 2002 allows disclosure of certain kinds of information by a public authority (such as the Competition and Markets Authority) to OFCOM for the purposes of OFCOM’s functions under this Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 42

Former providers of regulated services

‘(1) A power conferred by Chapter 6 of Part 7 (enforcement powers) to give a notice to a provider of a regulated service is to be read as including power to give a notice to a person who was, at the relevant time, a provider of such a service but who has ceased to be a provider of such a service (and that Chapter and Schedules 13 and 15 are to be read accordingly).

(2) “The relevant time” means—

(a) the time of the failure to which the notice relates, or

(b) in the case of a notice which relates to the requirement in section 90(1) to co-operate with an investigation, the time of the failure or possible failure to which the investigation relates.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 162, provides that a notice that may be given under Chapter 6 of Part 7 to a provider of a regulated service may also be given to a former provider of a regulated service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 43

Amendments of Part 4B of the Communications Act

‘Schedule (Amendments of Part 4B of the Communications Act) contains amendments of Part 4B of the Communications Act.’—(Paul Scully.)

This new clause introduces a new Schedule amending Part 4B of the Communications Act 2003 (see NS2).

Brought up, read the First and Second time, and added to the Bill.

New Clause 44

Repeal of Part 4B of the Communications Act: transitional provision etc

‘(1) Schedule (Video-sharing platform services: transitional provision etc) contains transitional, transitory and saving provision—

(a) about the application of this Act and Part 4B of the Communications Act during a period before the repeal of Part 4B of the Communications Act (or, in the case of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), in respect of charging years as mentioned in that Part);

(b) in connection with the repeal of Part 4B of the Communications Act.

(2) The Secretary of State may by regulations make transitional, transitory or saving provision of the kind mentioned in subsection (1)(a) and (b).

(3) Regulations under subsection (2) may amend or repeal—

(a) Part 2A of Schedule 3;

(b) Schedule (Video-sharing platform services: transitional provision etc).

(4) Regulations under subsection (2) may, in particular, make provision about—

(a) the application of Schedule (Video-sharing platform services: transitional provision etc) in relation to a service if the transitional period in relation to that service ends on a date before the date when section 172 comes into force;

(b) the application of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), including further provision about the calculation of a provider’s non-Part 4B qualifying worldwide revenue for the purposes of paragraph 19 of that Schedule;

(c) the application of Schedule 10 (recovery of OFCOM’s initial costs), and in particular how fees chargeable under that Schedule may be calculated, in respect of charging years to which Part 3 of Schedule (Video-sharing platform services: transitional provision etc) relates.’—(Paul Scully.)

This new clause introduces a new Schedule containing transitional provisions (see NS3), and provides a power for the Secretary of State to make regulations containing further transitional provisions etc.

Brought up, read the First and Second time, and added to the Bill.

New Clause 51

Publication by providers of details of enforcement action

‘(1) This section applies where—

(a) OFCOM have given a person (and not withdrawn) any of the following—

(i) a confirmation decision;

(ii) a penalty notice under section 119;

(iii) a penalty notice under section 120(5);

(iv) a penalty notice under section 121(6), and

(b) the appeal period in relation to the decision or notice has ended.

(2) OFCOM may give to the person a notice (a “publication notice”) requiring the person to—

(a) publish details describing—

(i) the failure (or failures) to which the decision or notice mentioned in subsection (1)(a) relates, and

(ii) OFCOM’s response, or

(b) otherwise notify users of the service to which the decision or notice mentioned in subsection (1)(a) relates of those details.

(3) A publication notice may require a person to publish details under subsection (2)(a) or give notification of details under subsection (2)(b) or both.

(4) A publication notice must—

(a) specify the decision or notice mentioned in subsection (1)(a) to which it relates,

(b) specify or describe the details that must be published or notified,

(c) specify the form and manner in which the details must be published or notified,

(d) specify a date by which the details must be published or notified, and

(e) contain information about the consequences of not complying with the notice.

(5) Where a publication notice requires a person to publish details under subsection (2)(a) the notice may also specify a period during which publication in the specified form and manner must continue.

(6) Where a publication notice requires a person to give notification of details under subsection (2)(b) the notice may only require that notification to be given to United Kingdom users of the service (see section 184).

(7) A publication notice may not require a person to publish or give notification of anything that, in OFCOM’s opinion—

(a) is confidential in accordance with subsections (8) and (9), or

(b) is otherwise not appropriate for publication or notification.

(8) A matter is confidential under this subsection if—

(a) it relates specifically to the affairs of a particular body, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that body.

(9) A matter is confidential under this subsection if—

(a) it relates to the private affairs of an individual, and

(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that individual.

(10) A person to whom a publication notice is given has a duty to comply with it.

(11) The duty under subsection (10) is enforceable in civil proceedings by OFCOM—

(a) for an injunction,

(b) for specific performance of a statutory duty under section 45 of the Court of Session Act 1988, or

(c) for any other appropriate remedy or relief.

(12) For the purposes of subsection (1)(b) “the appeal period”, in relation to a decision or notice mentioned in subsection (1)(a), means—

(a) the period during which any appeal relating to the decision or notice may be made, or

(b) where such an appeal has been made, the period ending with the determination or withdrawal of that appeal.’—(Paul Scully.)

This new clause, which is intended to be inserted after clause 129, gives OFCOM the power to require a person to whom a confirmation decision or penalty notice has been given to publish details relating to the decision or notice or to otherwise notify service users of those details.

Brought up, read the First and Second time, and added to the Bill.

New Clause 52

Exemptions from offence under section 152

‘(1) A recognised news publisher cannot commit an offence under section 152.

(2) An offence under section 152 cannot be committed by the holder of a licence under the Broadcasting Act 1990 or 1996 in connection with anything done under the authority of the licence.

(3) An offence under section 152 cannot be committed by the holder of a multiplex licence in connection with anything done under the authority of the licence.

(4) An offence under section 152 cannot be committed by the provider of an on-demand programme service in connection with anything done in the course of providing such a service.

(5) An offence under section 152 cannot be committed in connection with the showing of a film made for cinema to members of the public.’—(Paul Scully.)

This new clause contains exemptions from the offence in clause 152 (false communications). The clause ensures that holders of certain licences are only exempt if they are acting as authorised by the licence and, in the case of Wireless Telegraphy Act licences, if they are providing a multiplex service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 53

Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2)

‘(1) A person (A) commits an offence if—

(a) A sends a communication by electronic means which consists of or includes flashing images (see subsection (13)),

(b) either condition 1 or condition 2 is met, and

(c) A has no reasonable excuse for sending the communication.

(2) Condition 1 is that—

(a) at the time the communication is sent, it is reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it, and

(b) A sends the communication with the intention that such an individual will suffer harm as a result of viewing the flashing images.

(3) Condition 2 is that, when sending the communication—

(a) A believes that an individual (B)—

(i) whom A knows to be an individual with epilepsy, or

(ii) whom A suspects to be an individual with epilepsy,

will, or might, view it, and

(b) A intends that B will suffer harm as a result of viewing the flashing images.

(4) In subsections (2)(a) and (3)(a), references to viewing the communication are to be read as including references to viewing a subsequent communication forwarding or sharing the content of the communication.

(5) The exemptions contained in section (Exemptions from offence under section 152) apply to an offence under subsection (1) as they apply to an offence under section 152.

(6) For the purposes of subsection (1), a provider of an internet service by means of which a communication is sent is not to be regarded as a person who sends a communication.

(7) In the application of subsection (1) to a communication consisting of or including a hyperlink to other content, references to the communication are to be read as including references to content accessed directly via the hyperlink.

(8) A person (A) commits an offence if—

(a) A shows an individual (B) flashing images by means of an electronic communications device,

(b) when showing the images—

(i) A knows that B is an individual with epilepsy, or

(ii) A suspects that B is an individual with epilepsy,

(c) when showing the images, A intends that B will suffer harm as a result of viewing them, and

(d) A has no reasonable excuse for showing the images.

(9) An offence under subsection (1) or (8) cannot be committed by a healthcare professional acting in that capacity.

(10) A person who commits an offence under subsection (1) or (8) is liable—

(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);

(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding six months or a fine not exceeding the statutory maximum (or both);

(c) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).

(11) It does not matter for the purposes of this section whether flashing images may be viewed at once (for example, a GIF that plays automatically) or only after some action is performed (for example, pressing play).

(12) In this section—

(a) references to sending a communication include references to causing a communication to be sent;

(b) references to showing flashing images include references to causing flashing images to be shown.

(13) In this section—

“electronic communications device” means equipment or a device that is capable of transmitting images by electronic means;

“flashing images” means images which carry a risk that an individual with photosensitive epilepsy who viewed them would suffer a seizure as a result;

“harm” means—

(a) a seizure, or

(b) alarm or distress;

“individual with epilepsy” includes, but is not limited to, an individual with photosensitive epilepsy;

“send” includes transmit and publish (and related expressions are to be read accordingly).

(14) This section extends to England and Wales and Northern Ireland.’—(Paul Scully.)

This new clause creates (for England and Wales and Northern Ireland) a new offence of what is sometimes known as “epilepsy trolling” - sending or showing flashing images electronically to people with epilepsy intending to cause them harm.

Brought up, read the First and Second time, and added to the Bill.

New Clause 16

Communication offence for encouraging or assisting self-harm

‘(1) In the Suicide Act 1961, after section 3 insert—

“3A Communication offence for encouraging or assisting self-harm

(1) A person (“D”) commits an offence if—

(a) D sends a message,

(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and

(c) D’s act was intended to encourage or assist the infliction of serious physical harm.

(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.

(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.

(4) A person guilty of an offence under this section is liable—

(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;

(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.

(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.

(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.

(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).

(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—

(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and

(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and

(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.”’—(Mr Davis.)

This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.

Brought up, and read the First time.

Question put, That the clause be read a Second time.

ONLINE SAFETY BILL (First sitting)

Paul Scully Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Of course, Sir Roger. Without addressing the other amendments, I would like us to move away from the overly content-focused approach that the Government seem intent on taking in the Bill more widely. I will leave my comments there on the SNP amendment, but we support our SNP colleagues on it.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - -

It is a pleasure to serve under your chairmanship, Sir Roger.

Being online can be a hugely positive experience for children and young people, but we recognise the challenge of habit-forming behaviour or designed addiction to some digital services. The Bill as drafted, however, would already deliver the intent of the amendment from the hon. Member for Aberdeen North. If service providers identify in their risk assessment that habit-forming or addictive-behaviour risks cause significant harm to an appreciable number of children on a service, the Bill will require them to put in place measures to mitigate and manage that risk under clause 11(2)(a).

To meet the child safety risk assessment duties under clause 10, services must assess the risk of harm to children from the different ways in which the service is used; the impact of such use; the level of risk of harm to children; how the design and operation of the service may increase the risks identified; and the functionalities that facilitate the presence or dissemination of content that is harmful to children. The definition of “functionality” at clause 200 already includes an expression of a view on content, such as applying a “like” or “dislike” button, as at subsection (2)(f)(i).

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I thank the Minister for giving way so early on. He mentioned an “appreciable number”. Will he clarify what that is? Is it one, 10, 100 or 1,000?

Paul Scully Portrait Paul Scully
- Hansard - -

I do not think that a single number can be put on that, because it depends on the platform and the type of viewing. It is not easy to put a single number on that. An “appreciable number” is basically as identified by Ofcom, which will be the arbiter of all this. It comes back to what the hon. Member for Aberdeen North said about the direction that we, as she rightly said, want to give Ofcom. Ofcom has a range of powers already to help it assess whether companies are fulfilling their duties, including the power to require information about the operation of their algorithms. I would set the direction that the hon. Lady is looking for, to ensure that Ofcom uses those powers to the fullest and can look at the algorithms. We should bear in mind that social media platforms face criminal liability if they do not supply the information required by Ofcom to look under the bonnet.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If platforms do not recognise that they have an issue with habit-forming features, even though we know they have, will Ofcom say to them, “Your risk assessment is insufficient. We know that the habit-forming features are really causing a problem for children”?

Paul Scully Portrait Paul Scully
- Hansard - -

We do not want to wait for the Bill’s implementation to start those conversations with the platforms. We expect companies to be transparent about their design practices that encourage extended engagement and to engage with researchers to understand the impact of those practices on their users.

The child safety duties in clause 11 apply across all areas of a service, including the way it is operated and used by children and the content present on the service. Subsection (4)(b) specifically requires services to consider the

“design of functionalities, algorithms and other features”

when complying with the child safety duties. Given the direction I have suggested that Ofcom has, and the range of powers that it will already have under the Bill, I am unable to accept the hon. Member’s amendment, and I hope she will therefore withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I would have preferred it had the Minister been slightly more explicit that habit-forming features are harmful. That would have been slightly more helpful.

Paul Scully Portrait Paul Scully
- Hansard - -

I will say that habit-forming features can be harmful.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I thank the Minister. Absolutely—they are not always harmful. With that clarification, I am happy to beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 1, in clause 11, page 10, line 22, leave out

“, or another means of age assurance”.

This amendment omits words which are no longer necessary in subsection (3)(a) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 2 and 3.

Paul Scully Portrait Paul Scully
- Hansard - -

The Bill’s key objective, above everything else, is the safety of young people online. That is why the strongest protections in the Bill are for children. Providers of services that are likely to be accessed by children will need to provide safety measures to protect child users from harmful content, such as pornography, and from behaviour such as bullying. We expect companies to use age verification technologies to prevent children from accessing services that pose the highest risk of harm to them, and age assurance technologies and other measures to provide children with an age-appropriate experience.

The previous version of the Bill already focused on protecting children, but the Government are clear that the Bill must do more to achieve that and to ensure that the requirements on providers are as clear as possible. That is why we are strengthening the Bill and clarifying the responsibilities of providers to provide age-appropriate protections for children online. We are making it explicit that providers may need to use age assurance to identify the age of their users in order to meet the child safety duties for user-to-user services.

The Bill already set out that age assurance may be required to protect children from harmful content and activity, as part of meeting the duty in clause 11(3), but the Bill will now clarify that it may also be needed to meet the wider duty in subsection (2) to

“mitigate and manage the risks of harm to children”

and to manage

“the impact of harm to children”

on such services. That is important so that only children who are old enough are able to use functionalities on a service that poses a risk of harm to younger children. The changes will also ensure that children are signposted to support that is appropriate to their age if they have experienced harm. For those reasons, I recommend that the Committee accepts the amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I have a few questions regarding amendments 1 to 3, which as I mentioned relate to the thorny issue of age verification and age assurance, and I hope the Minister can clarify some of them.

We are unclear about why, in subsection (3)(a), the Government have retained the phrase

“for example, by using age verification, or another means of age assurance”.

Can that difference in wording be taken as confirmation that the Government want harder forms of age verification for primary priority content? The Minister will be aware that many in the sector are unclear about what that harder form of age verification may look like, so some clarity would be useful for all of us in the room and for those watching.

In addition, we would like to clarify the Minister’s understanding of the distinction between age verification and age assurance. They are very different concepts in reality, so we would appreciate it if he could be clear, particularly when we consider the different types of harm that the Bill will address and protect people from, how that will be achieved and what technology will be required for different types of platform and content. I look forward to clarity from the Minister on that point.

Paul Scully Portrait Paul Scully
- Hansard - -

That is a good point. In essence, age verification is the hard access to a service. Age assurance ensures that the person who uses the service is the same person whose age was verified. Someone could use their parent’s debit card or something like that, so it is not necessarily the same person using the service right the way through. If we are to protect children, in particular, we have to ensure that we know there is a child at the other end whom we can protect from the harm that they may see.

On the different technologies, we are clear that our approach to age assurance or verification is not technology-specific. Why? Because otherwise the Bill would be out of date within around six months. By the time the legislation was fully implemented it would clearly be out of date. That is why it is incumbent on the companies to be clear about the technology and processes they use. That information will be kept up to date, and Ofcom can then look at it.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On these platforms, the age verification requirements are clear: they must age-gate the adult content or get rid of it. They must do one or the other. Rightly, the Bill does not specify technologies. Technologies are available. The point is that a company must demonstrate that it is using an existing and available technology or that it has some other policy in place to remedy the issue. It has a choice, but it cannot do nothing. It cannot say that it does not have a policy on it.

Age assurance is always more difficult for children, because they do not have the same sort of ID that adults have. However, technologies exist: for instance, Yoti uses facial scanning. Companies do not have to do that either; they have to demonstrate that they do something beyond self-certification at the point of signing up. That is right. Companies may also demonstrate what they do to take robust action to close the accounts of children they have identified on their platforms.

If a company’s terms of service state that people must be 13 or over to use the platform, the company is inherently stating that the platform is not safe for someone under 13. What does it do to identify people who sign up? What does it do to identify people once they are on the platform, and what action does it then take? The Bill gives Ofcom the powers to understand those things and to force a change of behaviour and action. That is why—to the point made by the hon. Member for Pontypridd—age assurance is a slightly broader term, but companies can still extract a lot of information to determine the likely age of a child and take the appropriate action.

Paul Scully Portrait Paul Scully
- Hansard - -

I think we are all in agreement, and I hope that the Committee will accept the amendments.

Amendment 1 agreed to.

Amendments made: 2, in clause 11, page 10, line 25, leave out

“(for example, by using age assurance)”.

This amendment omits words which are no longer necessary in subsection (3)(b) of clause 11 because they are dealt with by the new subsection inserted by Amendment 3.

Amendment 3, in clause 11, page 10, line 26, at end insert—

“(3A) Age assurance to identify who is a child user or which age group a child user is in is an example of a measure which may be taken or used (among others) for the purpose of compliance with a duty set out in subsection (2) or (3).”—(Paul Scully.)

This amendment makes it clear that age assurance measures may be used to comply with duties in clause 11(2) as well as (3) (safety duties protecting children).

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 99, in clause 11, page 10, line 34, leave out paragraph (d) and insert—

“(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,”.

This amendment is intended to make clear that if it is proportionate to do so, services should have policies that include blocking access to parts of a service, rather than just the entire service or particular content on the service.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

If someone on a PlayStation wants to play online games, they must sign up to PlayStation Plus—that is how the model works. Once they pay that subscription, they can access online games and play Fortnite or Rocket League or whatever they want online. They then also have access to a suite of communication features; they can private message people. It would be disproportionate to ban somebody from playing any PlayStation game online in order to stop them from being able to private message inappropriate things. That would be a disproportionate step. I do not want PlayStation to be unable to act against somebody because it could not ban them, as that would be disproportionate, but was unable to switch off the direct messaging features because the clause does not allow it that flexibility. A person could continue to be in danger on the PlayStation platform as a result of private communications that they could receive. That is one example of how the provision would be key and important.

Paul Scully Portrait Paul Scully
- Hansard - -

Again, the Government recognise the intent behind amendment 99, which, as the hon. Member for Aberdeen North said, would require providers to be able to block children’s access to parts of a service, rather than the entire service. I very much get that. We recognise the nature and scale of the harm that can be caused to children through livestreaming and private messaging, as has been outlined, but the Bill already delivers what is intended by these amendments. Clause 11(4) sets out examples of areas in which providers will need to take measures, if proportionate, to meet the child safety duties. It is not an exhaustive list of every measure that a provider might be required to take. It would not be feasible or practical to list every type of measure that a provider could take to protect children from harm, because such a list could become out of date quickly as new technologies emerge, as the hon. Lady outlined with her PlayStation example.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I have a concern. The Minister’s phrasing was “to block children’s access”. Surely some of the issues would be around blocking adults’ access, because they are the ones causing risk to the children. From my reading of the clause, it does not suggest that the action could be taken only against child users; it could be taken against any user in order to protect children.

Paul Scully Portrait Paul Scully
- Hansard - -

I will come to that in a second. The hon. Member for Luton North talked about putting the onus on the victim. Any element of choice is there for adults; the children will be protected anyway, as I will outline in a second. We all agree that the primary purpose of the Bill is to be a children’s protection measure.

Ofcom will set out in codes of practice the specific steps that providers can take to protect children who are using their service, and the Government expect those to include steps relating to children’s access to high-risk features, such as livestreaming or private messaging. Clause 11(4)(d) sets out that providers may be required to take measures in the following areas:

“policies on user access to the service or to particular content present on the service, including blocking users from accessing the service or particular content”.

The other areas listed are intentionally broad categories that allow providers to take specific measures. For example, a measure in the area of blocking user access to particular content could include specific measures that restrict children’s access to parts of a service, if that is a proportionate way to stop users accessing that type of content. It can also apply to any of the features of a service that enable children to access particular content, and could therefore include children’s access to livestreaming and private messaging features. In addition, the child safety duties make it clear that providers need to use proportionate systems and processes that prevent children from encountering primary priority content that is harmful to them, and protect children in age groups at risk of harm from other content that is harmful to them.

While Ofcom will set out in codes of practice the steps that providers can take to meet these duties, we expect those steps, as we have heard, to include the use of age verification to prevent children accessing content that poses the greatest risk of harm to them. To meet that duty, providers may use measures that restrict children from accessing parts of the service. The Bill therefore allows Ofcom to require providers to take that step where it is proportionate. I hope that that satisfies the hon. Member for Aberdeen North, and gives her the direction that she asked for—that is, a direction to be more specific that Ofcom does indeed have the powers that she seeks.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Bill states that we can expect little impact on child protection before 2027-28 because of the enforcement road map and when Ofcom is planning to set that out. Does the Minister not think that in the meantime, that sort of ministerial direction would be helpful? It could make Ofcom’s job easier, and would mean that children could be protected online before 2027-28.

Paul Scully Portrait Paul Scully
- Hansard - -

The ministerial direction that the various platforms are receiving from the Dispatch Box, from our conversations with them and from the Bill’s progress as it goes through the House of Lords will be helpful to them. We do not expect providers to wait until the very last minute to implement the measures. They are starting to do so now, but we want them to go further, quicker.

Government amendment 4 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details of the measures that they use to restrict access in their terms of service and apply them consistently. Providers will also need to provide age-appropriate protections for children using their service. That includes protecting children from harmful content and activity on their service, as well as reviewing children’s use of higher-risk features, as I have said.

To meet the child safety risk assessment duties in clause 10, providers must assess: the risk of harm to children from functionalities that facilitate the presence or dissemination of harmful content; the level of risk from different kinds of harmful content, giving separate consideration to children in different age groups; the different ways in which the service is used, and the impact of such use on the level of risk of harm; and how the design and operation of the service may increase the risks identified.

The child safety duties in clause 11 apply across all areas of the service, including the way it is operated and used by children, as well as the content present on the service. For the reasons I have set out, I am not able to accept the amendments, but I hope that the hon. Member for Aberdeen North will take on board my assurances.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That was quite helpful. I am slightly concerned about the Minister’s focus on reducing children’s access to the service or to parts of it. I appreciate that is part of what the clause is intended to do, but I would also expect platforms to be able to reduce the ability of adults to access parts of the service or content in order to protect children. Rather than just blocking children, blocking adults from accessing some features—whether that is certain adults or adults as a group—would indeed protect children. My reading of clause 11(4) was that users could be prevented from accessing some of this stuff, rather than just child users. Although the Minister has given me more questions, I do not intend to push the amendment to a vote.

May I ask a question of you, Sir Roger? I have not spoken about clause stand part. Are we still planning to have a clause stand part debate?

--- Later in debate ---
None Portrait The Chair
- Hansard -

That is up to the Committee.

Amendment, by leave, withdrawn.

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 4, in clause 11, page 11, line 9, at end insert—

“(6A) If a provider takes or uses a measure designed to prevent access to the whole of the service or a part of the service by children under a certain age, a duty to—

(a) include provisions in the terms of service specifying details about the operation of the measure, and

(b) apply those provisions consistently.”

This amendment requires providers to give details in their terms of service about any measures they use which prevent access to a service (or part of it) by children under a certain age, and to apply those terms consistently.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendment 5.

Amendment 100, in clause 11, page 11, line 15, after “accessible” insert “for child users.”

This amendment makes clear that the provisions of the terms of service have to be clear and accessible for child users.

Paul Scully Portrait Paul Scully
- Hansard - -

Although the previous version of the Bill already focused on protecting children, as I have said, the Government are clear that it must do more to achieve that and to ensure that requirements for providers are as clear as possible. That is why we are making changes to strengthen the Bill. Amendments 4 and 5 will require providers who already have a minimum age requirement for access to their service, or parts of it, to give details in their terms of service of the measures that they use to ensure that children below the minimum age are prevented from gaining access. Those terms must be applied consistently and be clear and accessible to users. The change will mean that providers can be held to account for what they say in their terms of service, and will no longer be able to do nothing to prevent underage access.

The Government recognise the intent behind amendment 100, which is to ensure that terms of service are clear and accessible for child users, but the Bill as drafted sets an appropriate standard for terms of service. The duty in clause 11(8) sets an objective standard for terms of service to be clear and accessible, rather than requiring them to be clear for particular users. Ofcom will produce codes of practice setting out how providers can meet that duty, which may include provisions about how to tailor the terms of service to the user base where appropriate.

The amendment would have the unintended consequence of limiting to children the current accessibility requirement for terms of service. As a result, any complicated and detailed information that would not be accessible for children—for example, how the provider uses proactive technology—would probably need to be left out of the terms of service, which would clearly conflict with the duty in clause 11(7) and other duties relating to the terms of service. It is more appropriate to have an objective standard of “clear and accessible” so that the terms of service can be tailored to provide the necessary level of information and be useful to other users such as parents and guardians, who are most likely to be able to engage with the more detailed information included in the terms of service and are involved in monitoring children’s online activities.

Ofcom will set out steps that providers can take to meet the duty and will have a tough suite of enforcement powers to take action against companies that do not meet their child safety duties, including if their terms of service are not clear and accessible. For the reasons I have set out, I am not able to accept the amendment tabled by the hon. Member for Aberdeen North and I hope she will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

As I said, I will also talk about clause 11. I can understand why the Government are moving their amendments. It makes sense, particularly with things like complying with the provisions. I have had concerns all the way along—particularly acute now as we are back in Committee with a slightly different Bill from the one that we were first presented with—about the reliance on terms of service. There is a major issue with choosing to go down that route, given that providers of services can choose what to put in their terms of service. They can choose to have very good terms of service that mean that they will take action on anything that is potentially an issue and that will be strong enough to allow them to take the actions they need to take to apply proportionate measures to ban users that are breaking the terms of service. Providers will have the ability to write terms of service like that, but not all providers will choose to do that. Not all providers will choose to write the gold standard terms of service that the Minister expects everybody will write.

We have to remember that these companies’ and organisations’ No. 1 aim is not to protect children. If their No. 1 aim was to protect children, we would not be here. We would not need an Online Safety Bill because they would be putting protection front and centre of every decision they make. Their No. 1 aim is to increase the number of users so that they can get more money. That is the aim. They are companies that have a duty to their shareholders. They are trying to make money. That is the intention. They will not therefore necessarily draw up the best possible terms of service.

I heard an argument on Report that market forces will mean that companies that do not have strong enough terms of service, companies that have inherent risks in their platforms, will just not be used by people. If that were true, we would not be in the current situation. Instead, the platforms that are damaging people and causing harm—4chan, KiwiFarms or any of those places that cause horrendous difficulties—would not be used by people because market forces would have intervened. That approach does not work; it does not happen that the market will regulate itself and people will stay away from places that cause them or others harm. That is not how it works. I am concerned about the reliance on terms of service and requiring companies to stick to their own terms of service. They might stick to their own terms of service, but those terms of service might be utterly rubbish and might not protect people. Companies might not have in place what we need to ensure that children and adults are protected online.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

It is absolutely the case that those companies still have to do a risk assessment, and a child risk assessment if they meet the relevant criteria. The largest platforms, for example, will still have to do a significant amount of work on risk assessments. However, every time a Minister stands up and talks about what they are requiring platforms and companies to do, they say, “Companies must stick to their terms of service. They must ensure that they enforce things in line with their terms of service.” If a company is finding it too difficult, it will just take the tough things out of their terms of service. It will take out transphobia, it will take out abuse. Twitter does not ban anyone for abuse anyway, it seems, but it will be easier for Twitter to say, “Ofcom is going to try to hold us to account for the fact that we are not getting rid of people for abusive but not illegal messages, even though we say in our terms of service, ‘You must act with respect’, or ‘You must not abuse other users’. We will just take that out of our terms of service so that we are not held to account for the fact that we are not following our terms of service.” Then, because the abuse is not illegal—because it does not meet that bar—those places will end up being even less safe than they are right now.

For example, occasionally Twitter does act in line with its terms of service, which is quite nice: it does ban people who are behaving inappropriately, but not necessarily illegally, on its platform. However, if it is required to implement that across the board for everybody, it will be far easier for Twitter to say, “We’ve sacked all our moderators—we do not have enough people to be able to do this job—so we will just take it all out of the terms of service. The terms of service will say, ‘We will ban people for sharing illegal content, full stop.’” We will end up in a worse situation than we are currently in, so the reliance on terms of service causes me a big, big problem.

Turning to amendment 100, dealing specifically with the accessibility of this feature for child users, I appreciate the ministerial clarification, and agree that my amendment could have been better worded and potentially causes some problems. However, can the Minister talk more about the level of accessibility? I would like children to be able to see a version of the terms of service that is age-appropriate, so that they understand what is expected of them and others on the platform, and understand when and how they can make a report and how that report will be acted on. The kids who are using Discord, TikTok or YouTube are over 13—well, some of them are—so they are able to read and understand, and they want to know how to make reports and for the reporting functions to be there. One of the biggest complaints we hear from kids is that they do not know how to report things they see that are disturbing.

A requirement for children to have an understanding of how reporting functions work, particularly on social media platforms where people are interacting with each other, and of the behaviour that is expected of them, does not mean that there cannot be a more in-depth and detailed version of the terms of service, laying out potential punishments using language that children may not be able to understand. The amendment would specifically ensure that children have an understanding of that.

We want children to have a great time on the internet. There are so many ace things out there and wonderful places they can access. Lego has been in touch, for example; its website is really pretty cool. We want kids to be able to access that stuff and communicate with their friends, but we also want them to have access to features that allow them to make reports that will keep them safe. If children are making reports, then platforms will say, “Actually, there is real problem with this because we are getting loads of reports about it.” They will then be able to take action. They will be able to have proper risk assessments in place because they will be able to understand what is disturbing people and what is causing the problems.

I am glad to hear the Minister’s words. If he were even more clear about the fact that he would expect children to be able to understand and access information about keeping themselves safe on the platforms, then that would be even more helpful.

Paul Scully Portrait Paul Scully
- Hansard - -

On terms and conditions, it is clearly best practice to have a different level of explanation that ensures children can fully understand what they are getting into. The hon. Lady talked about the fact that children do not know how to report harm. Frankly, judging by a lot of conversations we have had in our debates, we do not know how to report harm because it is not transparent. On a number of platforms, how to do that is very opaque.

A wider aim of the Bill is to make sure that platforms have better reporting patterns. I encourage platforms to do exactly what the hon. Member for Aberdeen North says to engage children, and to engage parents. Parents are well placed to engage with reporting and it is important that we do not forget parenting in the equation of how Government and platforms are acting. I hope that is clear to the hon. Lady. We are mainly relying on terms and conditions for adults, but the Bill imposes a wider set of protections for children on the platforms.

Amendment 4 agreed to.

Amendment made: 5, in clause 11, page 11, line 15, after “(5)” insert “, (6A)”.—(Paul Scully.)

This amendment ensures that the duty in clause 11(8) to have clear and accessible terms of service applies to the terms of service mentioned in the new subsection inserted by Amendment 4.

Clause 11, as amended, ordered to stand part of the Bill.

Clause 12

Adults’ risk assessment duties

Question proposed, That the clause stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Paul Scully Portrait Paul Scully
- Hansard - -

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines

“content that is harmful to adults”,

including

“priority content that is harmful to adults”

for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or if it was not a breach of their terms of service. I want to be sure that I heard that correctly and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully Portrait Paul Scully
- Hansard - -

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. These amendments are required to ensure that the legislation is consistent and, importantly, that platforms, Ofcom and the Secretary of State are not held to requirements relating to the adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties. That includes references to the adult safety duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for the category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

None Portrait The Chair
- Hansard -

Before we proceed, I emphasise that we are debating clause 13 stand part as well as the litany of Government amendments that I read out.

ONLINE SAFETY BILL (Second sitting)

Paul Scully Excerpts
Committee stage (re-committed clauses and schedules)
Tuesday 13th December 2022

Public Bill Committees
Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

I thank the shadow Minister for that intervention. She is absolutely right. We have had a discussion about terms of reference and terms of service. Not only do most people not actually fully read them or understand them, but they are subject to change. The moment Elon Musk took over Twitter, everything changed. Not only have we got Donald Trump back, but Elon Musk also gave the keys to a mainstream social media platform to Kanye West. We have seen what happened then.

That is the situation the Government will now not shut the door on. That is regrettable. For all the reasons we have heard today, it is really damaging. It is really disappointing that we are not taking the opportunity to lead in this area.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - -

It is a pleasure to serve under your chairmanship, Dame Angela.

A lot of the discussion has replayed the debate from day two on Report about the removal of “legal but harmful” measures. Some of the discussion this morning and this afternoon has covered really important issues such as self-harm on which, as we said on the Floor of the House, we will introduce measures at a later stage. I will not talk about those measures now, but I would just say that we have already said that if we agree that the promotion of things such as self-harm is illegal, it should be illegal. Let us be very straight about how we deal with the promotion of self-harm.

The Bill will bring huge improvements for adult safety online. In addition to their duty to tackle illegal content, companies will have to provide adult users with tools to keep themselves safer. On some of the other clauses, we will talk about the triple shield that was mentioned earlier. If the content is illegal, it will still be illegal. If content does not adhere to the companies’ terms of service—that includes many of the issues that we have been debating for the last hour—it will have to be removed. We will come to user empowerment issues in further clauses.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister mentions tools for adults to keep themselves safe. Does he not think that that puts the onus on the users—the victims—to keep themselves safe? The measures as they stand in the Bill put the onus on the companies to be more proactive about how they keep people safe.

Paul Scully Portrait Paul Scully
- Hansard - -

The onus on adults is very much a safety net—very much a catch-all, after we have put the onus on the social media companies and the platforms to adhere to their own terms and conditions.

We have heard a lot about Twitter and the changes to Twitter. We can see the commercial imperative for mainstream platforms, certainly the category 1 platforms, to have a wide enough catch-all in their terms of service—anything that an advertiser, for example, would see as reasonably sensible—to be able to remain a viable platform in the first place. When Elon Musk first started making changes at Twitter, a comment did the rounds: “How do you build a multimillion-dollar company? You sell it to Elon Musk for $44 billion.” He made that change. He has seen the bottom falling out of his market and has lost a lot of the cash he put into Twitter. That is the commercial impetus that underpins a lot of the changes we are making.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

Is the Minister really suggesting that it is reasonable for people to say, “Right, I am going to have to walk away from Facebook because I don’t agree with their terms of service,” to hold the platform to account? How does he expect people to keep in touch with each other if they have to walk away from social media platforms in order to try to hold them to account?

Paul Scully Portrait Paul Scully
- Hansard - -

I do not think the hon. Lady is seriously suggesting that people can communicate only via Facebook—via one platform. The point is that there are a variety of methods of communication, of which Facebook has been a major one, although it is not one of the biggest now, with its share value having dropped 71% in the last year. That is, again, another commercial impetus in terms of changing its platform in other, usability-related ways.

--- Later in debate ---
Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

One of the examples I alluded to, which is particularly offensive for Jewish people, LGBT people and other people who were persecuted in the Nazi holocaust, is holocaust denial. Does the Minister seriously think that it is only Jewish people, LGBT people and other people who were persecuted in the holocaust who find holocaust denial offensive and objectionable and who do not want to see it as part of their online experience? Surely having these sorts of safety nets in place and saying that we do not think that certain kinds of content—although they may not be against the law—have a place online protects everyone’s experience, whether they are Jewish or not. Surely, no one wants to see holocaust denial online.

Paul Scully Portrait Paul Scully
- Hansard - -

No, but there is freedom of expression up to the point at which it starts to reach into illegality. We have to have the balance right: someone can say something in public—in any setting offline—but what the hon. Lady is suggesting is that, as soon as they hit a keyboard or a smartphone, there are two totally different regimes. That is not getting the balance right.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

The Minister says that we should have freedom of speech up to a point. Does that point include holocaust denial? He has just suggested that if something is acceptable to say in person, which I do not think holocaust denial should be, it should be acceptable online. Surely holocaust denial is objectionable whenever it happens, in whatever context—online or offline.

Paul Scully Portrait Paul Scully
- Hansard - -

I have been clear about where I set the line. [Interruption.] I have said that if something is illegal, it is illegal. The terms of service of the platforms largely cover the list that we are talking about. As my hon. Friend the Member for Folkestone and Hythe and I have both said, the terms of service of the vast majority of platforms—the big category 1 platforms—set a higher bar than was in our original Bill. The hon. Member for Luton North talked about whether we should have more evidence. I understand that the pre-legislative scrutiny committee heard evidence and came to a unanimous conclusion that the “legal but harmful” conditions should not be in the Bill.

Kim Leadbeater Portrait Kim Leadbeater (Batley and Spen) (Lab)
- Hansard - - - Excerpts

A few moments ago, the Minister compared the online world to the real world. Does he agree that they are not the same? Sadly, the sort of thing that someone says in the pub on a Friday night to two or three of their friends is very different from someone saying something dangerously harmful online that can reach millions and billions of people in a very short space of time. The person who spoke in the pub might get up the following morning and regret what they said, but no harm was done. Once something is out there in the online world, very serious damage can be done very quickly.

Paul Scully Portrait Paul Scully
- Hansard - -

The hon. Lady makes a good point. I talked about the offline world rather than the real world, but clearly that can happen. That is where the balance has to be struck, as we heard from my hon. Friend the Member for Don Valley. It is not black and white; it is a spectrum of greys. Any sensible person can soon see when they stray into areas that we have talked about such as holocaust denial and extremism, but we do not want to penalise people who invariably are testing their freedom of expression.

It is a fine balance, but I think that we have reached the right balance between protecting freedom of expression and protecting vulnerable adults by having three layers of checks. The first is illegality. The second is enforcing the terms of service, which provide a higher bar than we had in the original Bill for the vast majority of platforms, so that we can see right at the beginning how they will be enforced by the platforms. If they change them and do not adhere to them, Ofcom can step in. Ofcom can step in at any point to ensure that they are being enforced. The third is a safety net.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On illegal content, is the Minister proposing that the Government will introduce new legislation to make, for example, holocaust denial and eating disorder content illegal, whether it is online or offline? If he is saying that the bar in the online and offline worlds should be the same, will the Government introduce more hate crime legislation?

Paul Scully Portrait Paul Scully
- Hansard - -

Hate crime legislation will always be considered by the Ministry of Justice, but I am not committing to any changes. That is beyond my reach, but the two shields that we talked about are underpinned by a safety net.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Does my hon. Friend agree that the risk assessments that will be done on the priority illegal offences are very wide ranging, in addition to the risk assessments that will be done on meeting the terms of service? They will include racially and religiously motivated harassment, and putting people in fear of violence. A lot of the offences that have been discussed in the debate would already be covered by the adult safety risk assessments in the Bill.

Paul Scully Portrait Paul Scully
- Hansard - -

I absolutely agree. As I said in my opening remarks about the racial abuse picked up in relation to the Euro 2020 football championship, that would have been against the terms and conditions of all those platforms, but it still happened as the platforms were not enforcing those terms and conditions. Whether we put them on a list in the Bill or talk about them in the terms of the service, they need to be enforced, but the terms of service are there.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

On that point, does my hon. Friend also agree that the priority legal offences are important too? People were prosecuted for what they posted on Twitter and Instagram about the England footballers, so that shows that we understand what racially motivated offences are and that people are prosecuted for them. The Bill will require a minimum regulatory standard that meets that threshold and requires companies to act in cases such as that one, where we know what this content is, what people are posting and what is required. Not only will the companies have to act, but they will have to complete risk assessments to demonstrate how they will do that.

Paul Scully Portrait Paul Scully
- Hansard - -

Indeed. I absolutely agree with my hon. Friend and that is a good example of enforcement being used. People can be prosecuted if such abuse appears on social media, but a black footballer, who would otherwise have seen that racial abuse, can choose, through the user empowerment tools, to turn that off so that he does not see it. That does not mean that we cannot pursue a prosecution for racial abuse via a third-party complaint or via the platform.

None Portrait The Chair
- Hansard -

Order. Could the Minister address his remarks through the Chair so that I do not have to look at his back?

Paul Scully Portrait Paul Scully
- Hansard - -

I apologise, Dame Angela. I will bring my remarks to a close by saying that with those triple shields, we have the protections and the fine balance that we need.

Question put, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
User empowerment duties
Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 8, in clause 14, page 14, line 3, leave out “harmful content” and insert—

“content to which this subsection applies”.

This amendment, and Amendments 9 to 17, amend clause 14 (user empowerment) as the adult safety duties are removed (see Amendments 6, 7 and 41). New subsections (8B) to (8D) describe the kinds of content which are now relevant to the duty in clause 14(2) - see Amendment 15.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendments 9 to 14.

Government amendment 15, in clause 14, page 14, line 29, at end insert—

“(8A) Subsection (2) applies to content that—

(a) is regulated user-generated content in relation to the service in question, and

(b) is within subsection (8B), (8C) or (8D).

(8B) Content is within this subsection if it encourages, promotes or provides instructions for—

(a) suicide or an act of deliberate self-injury, or

(b) an eating disorder or behaviours associated with an eating disorder.

(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—

(a) race,

(b) religion,

(c) sex,

(d) sexual orientation,

(e) disability, or

(f) gender reassignment.

(8D) Content is within this subsection if it incites hatred against people—

(a) of a particular race, religion, sex or sexual orientation,

(b) who have a disability, or

(c) who have the characteristic of gender reassignment.”

This amendment describes the content relevant to the duty in subsection (2) of clause 14. The effect is (broadly) that providers must offer users tools to reduce their exposure to these kinds of content.

Amendment (a), to Government amendment 15, at end insert—

“(8E) Content is within this subsection if it—

(a) incites hateful extremism,

(b) provides false information about climate change, or

(c) is harmful to health.”

Government amendment 16, in clause 14, page 14, line 30, leave out subsection (9) and insert—

“(9) In this section—

‘disability’ means any physical or mental impairment;

‘injury’ includes poisoning;

‘non-verified user’ means a user who has not verified their identity to the provider of a service (see section 58(1));

‘race’ includes colour, nationality, and ethnic or national origins.”

This amendment inserts definitions of terms now used in clause 14.

Amendment (a), to Government amendment 16, after “mental impairment;” insert—

“‘hateful extremism’ means activity or materials directed at an out-group who are perceived as a threat to an in-group motivated by or intended to advance a political, religious or racial supremacist ideology—

(a) to create a climate conducive to hate crime, terrorism or other violence, or

(b) to attempt to erode or destroy the rights and freedoms protected by article 17 (Prohibition of abuse of rights) of Schedule 1 of the Human Rights Act 1998.”

Government amendment 17.

Paul Scully Portrait Paul Scully
- Hansard - -

The Government recognise the importance of giving adult users greater choice about what they see online and who they interact with, while upholding users’ rights to free expression online. That is why we have removed the “legal but harmful” provisions from the Bill in relation to adults and replaced them with a fairer, simpler approach: the triple shield.

As I said earlier, the first shield will require all companies in scope to take preventive measures to tackle illegal content or activity. The second shield will place new duties on category 1 services to improve transparency and accountability, and protect free speech, by requiring them to adhere to their terms of service when restricting access to content or suspending or banning users. As I said earlier, user empowerment is the key third shield, empowering adults with greater control over their exposure to legal forms of abuse or hatred, or content that encourages, promotes or provides instructions for suicide, self-harm or eating disorders. That has been done while upholding and protecting freedom of expression.

Amendments 9 and 12 will strengthen the user empowerment duty, so that the largest companies are required to ensure that those tools are effective in reducing the likelihood of encountering the listed content or alerting users to it, and are easy for users to access. That will provide adult users with greater control over their online experience.

We are also setting out the categories of content that those user empowerment tools apply to in the Bill, through amendment 15. Adult users will be given the choice of whether they want to take advantage of those tools to have greater control over content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders, and content that targets abuse or incites hate against people on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. This is a targeted approach, focused on areas where we know that adult users—particularly those who are vulnerable or disproportionately targeted by online hate and abuse—would benefit from having greater choice.

As I said, the Government remain committed to free speech, which is why we have made changes to the adult safety duties. By establishing high thresholds for inclusion in those content categories, we have ensured that legitimate debate online will not be affected by the user empowerment duties.

I want to emphasise that the user empowerment duties do not require companies to remove legal content from their services; they are about giving individual adult users the option to increase their control over those kinds of content. Platforms will still be required to provide users with the ability to filter out unverified users, if they so wish. That duty remains unchanged. For the reasons that I have set out, I hope that Members can support Government amendments 8 to 17.

I turn to the amendments in the name of the hon. Member for Pontypridd to Government amendments 15 and 16. As I have set out in relation to Government amendments 8 to 17, the Government recognise the intent behind the amendments—to apply the user empowerment tools in clause 14(2) to a greater range of content categories. As I have already set out, it is crucial that a tailored approach is taken, so that the user empowerment tools stay in balance with users’ rights to free expression online. I am sympathetic to the amendments, but they propose categories of content that risk being either unworkable for companies or duplicative to the approach already set out in amendment 15.

The category of

“content that is harmful to health”

sets an extremely broad scope. That risks requiring companies to apply the tools in clause 14(2) to an unfeasibly large volume of content. It is not a proportionate approach and would place an unreasonable burden on companies. It might also have concerning implications for freedom of expression, as it may capture important health advice. That risks, ultimately, undermining the intention behind the user empowerment tools in clause 14(2) by preventing users from accessing helpful content, and disincentivising users from using the features.

In addition, the category

“provides false information about climate change”

places a requirement on private companies to be the arbiters of truth on subjective and evolving issues. Those companies would be responsible for determining what types of legal content were considered false information, which poses a risk to freedom of expression and risks silencing genuine debate.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Did the Minister just say that climate change is subjective?

Paul Scully Portrait Paul Scully
- Hansard - -

No, not about whether climate change is happening, but we are talking about a wide range. “Provides false information”—how do the companies determine what is false? I am not talking about the binary question of whether climate change is happening, but climate change is a wide-ranging debate. “Provides false information” means that someone has to determine what is false and what is not. Basically, the amendment outsources that to the social media platforms. That is not appropriate.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

Would that not also apply to vaccine efficacy? If we are talking about everything being up for debate and nothing being a hard fact, we are entering slightly strange worlds where we undo a huge amount of progress, in particular on health.

Paul Scully Portrait Paul Scully
- Hansard - -

The amendment does not talk about vaccine efficacy; it talks about content that is harmful to health. That is a wide-ranging thing.

None Portrait The Chair
- Hansard -

Order. I am getting increasingly confused. The Minister appears to be answering a debate on an amendment that has not yet been moved. It might be helpful to the Committee, for good debate, if the Minister were to come back with his arguments against the amendment not yet moved by the Opposition spokesperson, the hon. Member for Pontypridd, once she has actually moved it. We can then hear her reasons for it and he can reply.

Paul Scully Portrait Paul Scully
- Hansard - -

In that case, having moved my amendment, I close my remarks.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Dame Angela. With your permission, I will take this opportunity to make some broad reflections on the Government’s approach to the new so-called triple-shield protection that we have heard so much about, before coming on to the amendment tabled in my name in the group.

Broadly, Labour is disappointed that the system-level approach to content that is harmful to adults is being stripped from the Bill and replaced with a duty that puts the onus on the user to keep themselves safe. As the Antisemitism Policy Trust among others has argued, the two should be able to work in tandem. The clause allows a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools to allow a person in effect to reduce their likelihood of encountering, or to alert them to, certain types of material. We have concerns about the overall approach of the Government, but Labour believes that important additions can be made to the list of content where user-empowerment tools must be in place, hence our amendment (a) to Government amendment 15.

In July, in a little-noticed written ministerial statement, the Government produced a prototype list of content that would be harmful to adults. The list included priority content that category 1 services need to address in their terms and conditions: online abuse and harassment—mere disagreement with another’s point of view would not reach the threshold for harmful content, and so would not be covered; circulation of real or manufactured intimate images without the subject’s consent; content promoting self-harm; content promoting eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.

We have concerns about whether listing those harms in the Bill is the most effective mechanism, mostly because we feel that the list should be more flexible and able to change according to the issues of the day, but it is clear that the Government will continue to pursue this avenue despite some very worrying gaps. With that in mind, will the Minister clarify what exactly underpins that list if there have been no risk assessments? What was the basis for drawing up that specific list? Surely the Government should be implored to publish the research that determined the list, at the very least.

I recognise that the false communications offence has remained in the Bill, but the list in Government amendment 15 is not exhaustive. Without the additions outlined in our amendment (a) to amendment 15, the list will do little to tackle some of the most pressing harms of our time, some of which we have already heard about today.

I am pleased that the list from the written ministerial statement has more or less been reproduced in amendment 15, under subsection (2), but there is a key and unexplained omission that our amendment (a) to it seeks to correct: the absence of the last point, on harmful health content. Amendment (a) seeks to reinsert such important content into the Bill directly. It seems implausible that the Government failed to consider the dangerous harm that health misinformation can have online, especially given that back in July they seemed to have a grasp of its importance by including it in the original list.

We all know that health-related misinformation and disinformation can significantly undermine public health, as we have heard. We only have to cast our minds back to the height of the coronavirus pandemic to remind ourselves of how dangerous the online space was, with anti-vax scepticism being rife. Many groups were impacted, including pregnant women, who received mixed messages about the safety of covid vaccination, causing widespread confusion, fear and inaction. By tabling amendment (a) to amendment 15, we wanted to understand why the Government have dropped that from the list and on what exact grounds.

In addition to harmful health content, our amendment (a) to amendment 15 would also add to the list content that incites hateful extremism and provides false information about climate change, as we have heard. In early written evidence from Carnegie, it outlined how serious the threat of climate change disinformation is to the UK. Malicious actors spreading false information on social media could undermine collective action to combat the threats. At present, the Online Safety Bill is not designed to tackle those threats head on.

We all recognise that social media is an important source of news and information for many people, and evidence is emerging of its role in climate change disinformation. The Centre for Countering Digital Hate published a report in 2021 called “The Toxic Ten: How ten fringe publishers fuel 69% of digital climate change denial”, which explores the issue further. Further analysis of activity on Facebook around COP26 undertaken by the Institute for Strategic Dialogue demonstrates the scale of the challenge in dealing with climate change misinformation and disinformation. The research compared the levels of engagement generated by reliable, scientific organisations and climate-sceptic actors, and found that posts from the latter frequently received more traction and reach than the former, which is shocking. For example, in the fortnight in which COP26 took place, sceptic content garnered 12 times the level of engagement that authoritative sources did on the platform, and 60% of the sceptic posts analysed could be classified as actively and explicitly attacking efforts to curb climate change, which just goes to show the importance of ensuring that climate change disinformation is also included in the list in Government amendment 15.

Our two amendments—amendment (a) to amendment 15, and amendment (a) to amendment 16—seek to ensure that the long-standing omission from the Bill of hateful extremism is put right here as a priority. There is increasing concern about extremism leading to violence and death that does not meet the definition for terrorism. The internet and user-to-user services play a central role in the radicalisation process, yet the Online Safety Bill does not cover extremism.

Colleagues may be aware that Sara Khan, the former lead commissioner for countering extremism, provided a definition of extremism for the Government in February 2021, but there has been no response. The issue has been raised repeatedly by Members across the House, including by my hon. Friend the Member for Plymouth, Sutton and Devonport (Luke Pollard), following the tragic murders carried out by a radicalised incel in his constituency.

Amendment (a) to amendment 16 seeks to bring a formal definition of hateful extremism into the Bill and supports amendment (a) to amendment 15. The definition, as proposed by Sara Khan, who was appointed as Britain’s first countering extremism commissioner in 2018, is an important first step in addressing the gaps that social media platforms and providers have left open for harm and radicalisation.

Social media platforms have often been ineffective in removing other hateful extremist content. In November 2020, The Guardian reported that research from the Centre for Countering Digital Hate had uncovered how extremist merchandise had been sold on Facebook and Instagram to help fund neo-Nazi groups. That is just one of a huge number of instances, and it goes some way to suggest that a repeatedly inconsistent and ineffective approach to regulating extremist content is the one favoured by some social media platforms.

I hope that the Minister will seriously consider the amendments and will see the merits in expanding the list in Government amendment 15 to include these additional important harms.

--- Later in debate ---
Allowing such content freely on platforms and doing nothing to ensure that smaller but high-harm platforms are brought into the remit of this Bill is a backward step. We should be strengthening, not weakening, the Bill in this Committee. That is why I oppose the Government’s position and wholeheartedly support the Opposition’s amendments to clause 14.
Paul Scully Portrait Paul Scully
- Hansard - -

I have talked a little already about these amendments, so let me sum up where I think we are. I talked about harmful health content and why it is not included. The Online Safety Bill will force social media companies to tackle health misinformation and disinformation online, where it constitutes a criminal offence. It includes the communications offence, which would capture posts encouraging dangerous hoax cures, where the sender knows the information to be false and intends to cause harm, such as encouraging drinking bleach to cure cancer, which we heard about a little earlier.

The legislation is only one part of the wider Government approach to this issue. It includes the work of the counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities and engages with platforms directly to ensure that appropriate action is taken, in addition to the Government’s work to build users’ resilience to misinformation through media literacy.

Including harmful health content as a category risks requiring companies to apply the adult user empowerment tools to an unfeasibly large volume of content—way beyond just the vaccine efficacy that was mentioned. That has implications both for regulatory burden and for freedom of expression, as it may capture important health advice. Similarly, on climate change, the Online Safety Bill itself will introduce new transparency, accountability and free speech duties on category 1 services. If a platform says that certain types of content are not allowed, it will be held to account for their removal.

We recognised that there was a heightened risk of disinformation surrounding the COP26 summit. The counter-disinformation unit led by the Department for Digital, Culture, Media and Sport brought together monitoring and analysis capabilities across Government to understand disinformation that posed a risk to public safety or to delegates or that represented attempts at interference from malign actors. We are clear that free debate is essential to a democracy and that the counter-disinformation unit should not infringe upon political debate. The Government already work closely with the major social media platforms to encourage them to collaborate at speed to remove disinformation as per their terms of service.

Amendment (a) to amendment 15 and amendment (a) to amendment 16 would create that new category of content that incites hateful extremism. That is closely aligned with the approach that the Government are already taking with amendment 15, specifically subsections (8C) and (8D), which create a category of content that is abusive or incites hate on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment. Those conditions would likely capture the majority of the kinds of content that the hon. Members are seeking to capture through their hateful extremism category. For example, it would capture antisemitic abuse and conspiracy theories, racist abuse and promotion of racist ideologies.

Furthermore, where companies’ terms of service say they prohibit or apply restrictions to the kind of content listed in the Opposition amendments, companies must ensure that those terms are consistently enforced. It comes back so much to the enforcement. They must also ensure that the terms of service are easily understandable.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

If this is about companies enforcing what is in their terms of service for the use of their platforms, could it not create a perverse incentive for them to have very little in their terms of service? If they will be punished for not enforcing their terms of service, surely they will want them to be as lax as possible in order to limit their legal liability for enforcing them. Does the Minister follow?

Paul Scully Portrait Paul Scully
- Hansard - -

I follow, but I do not agree. The categories of content in proposed new subsections (8C) and (8D), introduced by amendment 15, underpin a lot of this. I answered the question in an earlier debate when talking about the commercial impetus. I cannot imagine many mainstream advertisers wanting to advertise with a company that removed from its terms of service the exclusion of racial abuse, misogyny and general abuse. We have seen that commercial impetus really kicking in with certain platforms. For those reasons, I am unable to accept the amendments to the amendments, and I hope that the Opposition will not press them to a vote.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I am grateful for the opportunity to push the Minister further. I asked him whether he could outline where the list in amendment 15 came from. Will he publish the research that led him to compile that specific list of priority harms?

Paul Scully Portrait Paul Scully
- Hansard - -

The definitions that we have taken are ones that strike the right balance and have a comparatively high threshold, so that they do not capture challenging and robust discussions on controversial topics.

Amendment 8 agreed to.

Amendments made: 9, in clause 14, page 14, line 5, after “to” insert “effectively”.

This amendment strengthens the duty in this clause by requiring that the systems or processes used to deal with the kinds of content described in subsections (8B) to (8D) (see Amendment 15) should be designed to effectively increase users’ control over such content.

Amendment 10, in clause 14, page 14, line 6, leave out from “encountering” to “the” in line 7 and insert

“content to which subsection (2) applies present on”.

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Amendment 11, in clause 14, page 14, line 9, leave out from “to” to end of line 10 and insert

“content present on the service that is a particular kind of content to which subsection (2) applies”.—(Paul Scully.)

This amendment inserts a reference to the kinds of content now relevant for this clause, instead of referring to priority content that is harmful to adults.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to move amendment 102, in clause 14, page 14, line 12, leave out “made available to” and insert “in operation for”.

This amendment, and Amendment 103, relate to the tools proposed in Clause 14 which will be available for individuals to use on platforms to protect themselves from harm. This amendment specifically forces platforms to have these safety tools “on” by default.

--- Later in debate ---
Kim Leadbeater Portrait Kim Leadbeater
- Hansard - - - Excerpts

I will speak briefly in favour of amendments 102 and 103. As I mentioned a few moments ago, legal but harmful content can act as the gateway to dangerous radicalisation and extremism. Such content, hosted by mainstream social media platforms, should not be permitted unchecked online. I appreciate that for children the content will be banned, but I strongly believe that the default position should be for such content to be hidden by default from all adult users, as the amendments would ensure.

The chain of events that leads to radicalisation, as I spelt out, relies on groups and individuals reaching people unaware that they are being radicalised. The content is posted in otherwise innocent Facebook groups, forums or Twitter threads. Adding a toggle, hidden somewhere in users’ settings, which few people know about or use, will do nothing to stop that. It will do nothing to stop the harmful content from reaching vulnerable and susceptible users.

We, as legislators, have an obligation to prevent, at root, that harmful content from reaching and drawing in those vulnerable and susceptible to the misinformation and conspiracy spouted by vile groups and individuals wishing to spread their harm. The only way that we can make meaningful progress is by putting the responsibility squarely on platforms, to ensure that by default users do not come across the content in the first place.

Paul Scully Portrait Paul Scully
- Hansard - -

In the previous debate, I talked about amendment 15, which brought in a lot of protections against content that encourages and promotes, or provides instruction for, self-harm, suicide or eating disorders, and against content that is abusive or incites hate on the basis of race, religion, disability, sex, gender reassignment or sexual orientation. We have also placed a duty on the largest platforms to offer adults the option to filter out unverified users if they so wish. That is a targeted approach that reflects areas where vulnerable users in particular could benefit from having greater choice and control. I come back to the fact that that is the third shield and an extra safety net. A lot of the extremes we have heard about, which have been used as debating points, as important as they are, should very much be wrapped up by the first two shields.

We have a targeted approach, but it is based on choice. It is right that adult users have a choice about what they see online and who they interact with. It is right that this choice lies in the hands of those adults. The Government mandating that these tools be on by default goes against the central aim of users being empowered to choose for themselves whether they want to reduce their engagement with some kinds of legal content.

We have been clear right from the beginning that it is not the Government’s role to say what legal content adults should or should not view online or to incentivise the removal of legal content. That is why we removed the adult legal but harmful duties in the first place. I believe we are striking the right balance between empowering adult users online and protecting freedom of expression. For that reason, I am not able to accept the amendments from the hon. Member for Pontypridd.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

It is disappointing that the Government are refusing to back these amendments to place the toggle as “on” by default. It is something that we see as a safety net, as the Minister described. Why would someone have to choose to have the safety net there? If someone does not want it, they can easily take it away. The choice should be that way around, because it is there to protect all of us.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

My hon. Friend makes a very good point. It goes to show the nature of this as a protection for all of us, even MPs, from accessing content that could be harmful to our health or, indeed, profession. Given the nature of the amendment, we feel that this is a safety net that should be available to all. It should be on by default.

Paul Scully Portrait Paul Scully
- Hansard - -

I should say that, in the spirit of choice, companies can also choose to default it to be switched on in the first place as well.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister makes the point that companies can choose to have it on by default, but we would not need this Bill in the first place if companies did the right thing. Let us be clear: we would not have had to be here debating this for the past five years—for me it has been 12 months—if companies were going to do the right thing and protect people from harmful content online. On that basis, I will push the amendments to a vote.

Question put, That the amendment be made.

--- Later in debate ---
I am asking the Minister to make it really clear that these tools should be available and accessible to everybody, and that Ofcom will look at that availability and accessibility and listen to experts who say that there is a real issue with a certain website because the tools are not as accessible as they should be. Would the Minister be kind enough to make that incredibly clear, so that platforms are aware of the direction and the intention? Ofcom also needs to be aware that this is a priority and that these tools should be available to everyone in order to provide that level of accessibility, and in order that everybody can enjoy cat videos.
Paul Scully Portrait Paul Scully
- Hansard - -

I am happy to do that. In the same way that we spoke this morning about children’s protection, I am very aware of the terms of service and what people are getting into by looking for cats or whatever they want to do.

The Bill requires providers to make all the user empowerment and protection tools available to all adults, including those with learning disabilities. Clause 14(4) makes it explicitly clear that features offered by providers, in compliance with the duty for users to be given greater control over the content that they see, must be made available to all adult users. Clause 14(5) further outlines that providers must have clear and accessible terms of service about what tools are offered in their service and how users may take advantage of them. We have strengthened the accessibility of the user empowerment duties through Government amendment 12 as well, to make sure that user empowerment tools and features are easy for users to access.

In addition, clause 58(1) says that providers must offer all adult users the option to verify themselves so that vulnerable users, including those with learning disabilities, are not at a disadvantage as a result of the user empowerment duties. Clause 59(2) and (3) further stipulate that in producing the guidance for providers about the user verification duty, Ofcom must have particular regard to the desirability of making identity verification available to vulnerable adult users, and must consult with persons who represent the interests of vulnerable adult users. That is about getting the thoughts of experts and advocates into their processes to make sure that they can enforce what is going on.

In addition, Ofcom is subject to the public sector equality duty, so it will have to take into account the ways in which people with disabilities may be impacted when performing its duties, such as writing its codes of practice for the user empowerment duty. I hope the hon. Member will appreciate the fact that, in a holistic way, that covers the essence of exactly what she is trying to do in her amendment, so I do not believe her amendment is necessary.

--- Later in debate ---
This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41.)
Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 19, in clause 18, page 19, line 32, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 20.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendments 20 and 21, 26 and 27, 30, 34 and 35, 67, 71, 46 and 47, 50, 53, 55 to 57, and 95.

Government new clause 3—Duty not to act against users except in accordance with terms of service.

Government new clause 4—Further duties about terms of service.

Government new clause 5—OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service).

Government new clause 6—Interpretation of this Chapter.

Paul Scully Portrait Paul Scully
- Hansard - -

I am seeking to impose new duties on category 1 services to ensure that they are held accountable to their terms of service and to protect free speech. Under the status quo, companies get to decide what we do and do not see online. They can arbitrarily ban users or remove their content without offering any form of due process and with very few avenues for users to achieve effective redress. On the other hand, companies’ terms of service are often poorly enforced, if at all.

I have mentioned before the horrendous abuse suffered by footballers around the 2020 Euro final, despite most platforms’ terms and conditions clearly not allowing that sort of content. There are countless similar instances, for example, relating to antisemitic abuse—as we have heard—and other forms of hate speech, that fall below the criminal threshold.

This group of amendments relates to a series of new duties that will fundamentally reset the relationship between platforms and their users. The duties will prevent services from arbitrarily removing content or suspending users without offering users proper avenues to appeal. At the same time, they will stop companies making empty promises to their users about their terms of service. The duties will ensure that where companies say they will remove content or ban a user, they actually do.

Government new clause 3 is focused on protecting free speech. It would require providers of category 1 services to remove or restrict access to content, or ban or suspend users, only where this is consistent with their terms of service. Ofcom will oversee companies’ systems and processes for discharging those duties, rather than supervising individual decisions.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am grateful for what the Minister has said, and glad that Ofcom will have a role in seeing that companies do not remove content that is not in breach of terms of service where there is no legal requirement to do so. In other areas of the Bill where these duties exist, risk assessments are to be conducted and codes of practice are in place. Will there similarly be risk assessments and codes of practice to ensure that companies comply with their freedom of speech obligations?

Paul Scully Portrait Paul Scully
- Hansard - -

Absolutely. As I say, it is really important that people understand right at the beginning, through risk assessments, what they are signing up for and what they can expect. To come back to the point of whether someone is an adult or a child, it is really important that parents lean in when it comes to children’s protections; that is a very important tool in the armoury.

New clause 4 will require providers of category 1 services to ensure that what their terms of service say about their content moderation policies is clear and accessible. Those terms have to be easy for users to understand, and should have sufficient detail so that users know what to expect in relation to moderation actions. Providers of category 1 services must apply their terms of service consistently, and they must have in place systems and processes that enable them to enforce their terms of service consistently.

These duties will give users the ability to report any content or account that they suspect does not meet a platform’s terms of service. They will also give users the ability to make complaints about platforms’ moderation actions, and raise concerns if their content is removed in error. Providers will be required to take appropriate action in response to complaints. That could include removing content that they prohibit, or reinstating content removed in error. These duties ensure that providers are made aware of issues to do with their services and require them to take action to resolve them, to keep users safe, and to uphold users’ rights to free speech.

The duties set out in new clauses 3 and 4 will not apply to illegal content, content that is harmful to children or consumer content. That is because illegal content and content that is harmful to children are covered by existing duties in the Bill, and consumer content is already regulated under consumer protection legislation. Companies will also be able to remove any content where they have a legal obligation to do so, or where the user is committing a criminal offence, even if that is not covered in their terms of service.

New clause 5 will require Ofcom to publish guidance to help providers of category 1 services to understand what they need to do to comply with their new duties. That could include guidance on how to make their terms of service clear and easy for users to understand, and how to operate an effective reporting and redress mechanism. The guidance will not prescribe what types of content companies should include in their terms of service, or how they should treat such content. That will be for companies to decide, based on their knowledge of their users, and their brand and commercial incentives, and subject to their other legal obligations.

New clause 6 clarifies terms used in new clauses 3 and 4. It also includes a definition of “Consumer content”, which is excluded from the main duties in new clauses 3 and 4. This covers content that is already regulated by the Competition and Markets Authority and other consumer protection bodies, such as content that breaches the Consumer Protection from Unfair Trading Regulations 2008. These definitions are needed to provide clarity to companies seeking to comply with the duties set out in new clauses 3 and 4.

The remaining amendments to other provisions in the Bill are consequential on the insertion of these new transparency, accountability and free speech duties. They insert references to the new duties in, for example, the provisions about content reporting, enforcement, transparency and reviewing compliance. That will ensure that the duties apply properly to the new measure.

Amendment 30 removes the duty on platforms to include clear and accessible provisions in their terms of service informing users that they have a right of action in court for breach of contract if a platform removes or restricts access to their content in violation of its terms of service. This is so that the duty can be moved to new clause 4, which focuses on ensuring that platforms comply with their terms of service. The replacement duty in new clause 4 will go further than the original duty, in that it will cover suspensions and bans of users as well as restrictions on content.

Amendments 46 and 47 impose a new duty on Ofcom to have regard to the need for it to be clear to providers of category 1 services what they must do to comply with their new duties. These amendments will also require Ofcom to have regard to the extent to which providers of category 1 services are demonstrating, in a transparent and accountable way, how they are complying with their new duties.

Lastly, amendment 95 temporarily exempts video-sharing platforms that are category 1 services from the new terms of service duties, as set out in new clauses 3 and 4, until the Secretary of State agrees that the Online Safety Bill is sufficiently implemented. This approach simultaneously maximises user protections by the temporary continuation of the VSP regime and minimises burdens for services and Ofcom. The changes are central to the Government’s intention to hold companies accountable for their promises. They will protect users in a way that is in line with companies’ terms of service. They are a critical part of the triple shield, which aims to protect adults online. It ensures that users are safe by requiring companies to remove illegal content, enforce their terms of service and provide users with tools to control their online experiences. Equally, these changes prevent arbitrary or random content removal, which helps to protect pluralistic and robust debate online. For those reasons, I hope that Members can support the amendments.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

This is an extremely large grouping so, for the sake of the Committee, I will do my best to keep my comments focused and brief where possible. I begin by addressing Government new clauses 3 and 4 and the consequential amendments.

Government new clause 3 introduces new duties that aim to ensure that the largest or most risky online service providers design systems and processes that ensure they cannot take down or restrict content in a way that prevents a person from seeing it without further action by the user, or ban users, except in accordance with their own terms of service, or if the content breaks the law or contravenes the Online Safety Bill regime. This duty is referred to as the duty not to act against users except in accordance with terms of service. In reality, that will mean that the focus remains far too much on the banning, taking down and restriction of content, rather than on considering the systems and processes behind the platforms that perpetuate harm.

Labour has long held the view that the Government have gone down an unhelpful cul-de-sac on free speech. Instead of focusing on defining exactly which content is or is not harmful, the Bill should be focused on the processes by which harmful content is amplified on social media. We must recognise that a person posting a racist slur online that nobody notices, shares or reads is significantly less harmful than a post that can quickly go viral, and can within hours gain millions of views or shares. We have talked a lot in this place about Kanye West and the comments he has made on Twitter in the past few weeks. It is safe to say that a comment by Joe Bloggs in Hackney that glorifies Hitler does not have the same reach or produce the same harm as Kanye West saying exactly the same thing to his 30 million Twitter followers.

Our approach has the benefit of addressing the things that social media companies can control—for example, how content spreads—rather than the things they cannot control, such as what people say online. It reduces the risk to freedom of speech because it tackles how content is shared, rather than relying entirely on taking down harmful content. Government new clause 4 aims to improve the effectiveness of platforms’ terms of service in conjunction with the Government’s new triple shield, which the Committee has heard a lot about, but the reality is they are ultimately seeking to place too much of the burden of protection on extremely flexible and changeable terms of service.

If a provider’s terms of service say that certain types of content are to be taken down or restricted, then providers must run systems and processes to ensure that that can happen. Moreover, people must be able to report breaches easily, through a complaints service that delivers appropriate action, including when the service receives complaints about the provider. This “effectiveness” duty is important but somewhat misguided.

The Government, having dropped some of the “harmful but legal” provisions, seem to expect that if large and risky services—the category 1 platforms—claim to be tackling such material, they must deliver on that promise to the customer and user. This reflects a widespread view that companies may pick and choose how to apply their terms of service, or implement them loosely and interchangeably, as we have heard. Those failings will lead to harm when people encounter things that they would not have thought would be there when they signed up. All the while, service providers that do not fall within category 1 need not enforce their terms of service, or may do so erratically or discriminatorily. That includes search engines, no matter how big.

This large bundle of amendments seems to do little to actually keep people safe online. I have already made my concerns about the Government’s so-called triple shield approach to internet safety clear, so I will not repeat myself. We fundamentally believe that the Government’s approach, which places too much of the onus on the user rather than the platform, is wrong. We therefore cannot support the approach that is taken in the amendments. That being said, the Minister can take some solace from knowing that we see the merits of Government new clause 5, which

“requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4”.

If this is the avenue that the Government insist on going down, it is absolutely vital that providers are advised by Ofcom on the processes they will be required to take to comply with these new duties.

Amendment 19 agreed to.

Amendment made: 20, in clause 18, page 19, line 33, at end insert

“, and

(b) section (Further duties about terms of service)(5)(a) (reporting of content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about content reporting inserted by NC4.

Clause 18, as amended, ordered to stand part of the Bill.

Clause 19

Duties about complaints procedures

Amendment made: 21, in clause 19, page 20, line 15, leave out “, (3) or (4)” and insert “or (3)”.—(Paul Scully.)

This amendment removes a reference to clause 20(4), as that provision is moved to NC4.

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 22, in clause 19, page 20, line 27, leave out from “down” to “and” in line 28 and insert

“or access to it being restricted, or given a lower priority or otherwise becoming less likely to be encountered by other users,”.

NC2 states what is meant by restricting users’ access to content, and this amendment makes a change in line with that, to avoid any implication that downranking is a form of restriction on access to content.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendment 59.

Government new clause 2—Restricting users’ access to content.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - -

These amendments clarify the meaning of “restricting access to content” and “access to content” for the purposes of the Bill. Restricting access to content is an expression that is used in various provisions across the Bill, such as in new clause 2, under which providers of category 1 services will have a duty to ensure that they remove or restrict access to users’ content only where that is in accordance with their terms of service or another legal obligation. There are other such references in clauses 15, 16 and 17.

The amendments make it clear that the expression

“restricting users’ access to content”

covers cases where a provider prevents a user from accessing content without that user taking a prior step, or where content is temporarily hidden from a user. They also make it clear that this expression does not cover any restrictions that the provider puts in place to enable users to apply user empowerment tools to limit the content that they encounter, or cases where access to content is controlled by another user, rather than by the provider.

The amendments are largely technical, but they do cover things such as down-ranking. Amendment 22 is necessary because the previous wording of this provision wrongly suggested that down-ranking was covered by the expression “restricting access to content”. Down-ranking is the practice of giving content a lower priority on a user’s feed. The Government intend that users should be able to complain if they feel that their content has been inappropriately down-ranked as a result of the use of proactive technology. This amendment ensures consistency.

I hope that the amendments provide clarity as to the meaning of restricting access to content for those affected by the Bill, and assist providers with complying with their duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Again, I will keep my comments on clause 19 brief, as we broadly support the intentions behind the clause and the associated measures in the grouping. My hon. Friend the Member for Worsley and Eccles South (Barbara Keeley) spoke at length about this important clause, which relates to the all-important complaints procedures available around social media platforms and companies, in the previous Bill Committee.

During the previous Committee, Labour tabled amendments that would have empowered more individuals to make a complaint about search content in the event of non-compliance. In addition, we wanted an external complaints option for individuals seeking redress. Sadly, all those amendments were voted down by the last Committee, but I must once again press the Minister on those points, particularly in the context of the new amendments that have been tabled.

Without redress for individual complaints, once internal mechanisms have been exhausted, victims of online abuse could be left with no further options. Consumer protections could be compromised and freedom of expression, with which the Government seem to be borderline obsessed, could be infringed for people who feel that their content has been unfairly removed.

Government new clause 2 deals with the meaning of references to

“restricting users’ access to content”,

in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14. We see amendments 22 and 59 as important components of new clause 2, and are therefore more than happy to support them. However, I reiterate to the Minister and place on the record once again the importance of introducing an online safety ombudsman, which we feel is crucial to new clause 2. The Joint Committee recommended the introduction of such an ombudsman, who would consider complaints when internal routes of redress had not resulted in resolution, had failed to address risk and had led to significant and demonstrable harm. As new clause 2 relates to restricting users’ access to content, we must also ensure that there is an appropriate channel for complaints if there is an issue that users wish to take up around restrictions in accessing content.

By now, the Minister will be well versed in my thoughts on the Government’s approach, and on the reliance on the user empowerment tool approach more broadly. It is fundamentally an error to pursue a regime that is so content-focused. Despite those points, we see the merits in Government amendments 22 and 59, and in new clause 2, so have not sought to table any further amendments at this stage.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am slightly confused, and would appreciate a little clarification from the Minister. I understand what new clause 2 means; if the hon. Member for Pontypridd says that she does not want to see content of a certain nature, and I put something of that nature online, I am not being unfairly discriminated against in any way because she has chosen to opt out of receiving that content. I am slightly confused about the downgrading bit.

I know that an awful lot of platforms use downgrading when there is content that they find problematic, or something that they feel is an issue. Rather than taking that content off the platform completely, they may just no longer put it in users’ feeds, for example; they may move it down the priority list, and that may be part of what they already do to keep people safe. I am not trying to criticise what the Government are doing, but I genuinely do not understand whether that downgrading would still be allowed, whether it would be an issue, and whether people could complain about their content being downgraded because the platform was a bit concerned about it, and needed to check it out and work out what was going on, or if it was taken off users’ feeds.

Some companies, if they think that videos have been uploaded by people who are too young to use the platform, or by a registered child user of the platform, will not serve that content to everybody’s feeds. I will not be able to see something in my TikTok feed that was published by a user who is 13, for example, because there are restrictions on how TikTok deals with and serves that content, in order to provide increased protection and the safety that they want on their services.

Will it still be acceptable for companies to have their own internal downgrading system, in order to keep people safe, when content does not necessarily meet an illegality bar or child safety duty bar? The Minister has not used the phrase “market forces”; I think he said “commercial imperative”, and he has talked a lot about that. Some companies and organisations use downgrading to improve the systems on their site and to improve the user experience on the platform. I would very much appreciate it if the Minister explained whether that will still be the case. If not, will we all have a worse online experience as a result?

Paul Scully Portrait Paul Scully
- Hansard - -

I will have a go at that, but I am happy to write to the hon. Lady if I do not respond as fully as she wants. Down-ranking content is a moderation action, as she says, but it is not always done just to restrict access to content; there are many reasons why people might want to do it. Through these changes, we are saying that the content is not actually being restricted; it can still be seen if it is searched for or otherwise encountered. That is consistent with the clarification.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

This is quite an important point. The hon. Member for Aberdeen North was talking about recommendation systems. If a platform chooses not to amplify content, that is presumably not covered. As long as the content is accessible, someone could search and find it. That does not inhibit a platform’s decision, for policy reasons or whatever, not to actively promote it.

Paul Scully Portrait Paul Scully
- Hansard - -

Absolutely. There are plenty of reasons why platforms will rank users’ content, including down-ranking it. Providing personalised content recommendations will involve that process as well. It is not practical to specify that restricting access includes down-ranking. That is why we made that change.

Amendment 22 agreed to.

Amendments made: 23, in clause 19, page 21, line 7, leave out from “The” to “complaints” in line 10 and insert

“relevant kind of complaint for Category 1 services is”.

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 24, in clause 19, page 21, line 12, leave out sub-paragraph (i).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 25, in clause 19, page 21, line 18, leave out paragraphs (c) and (d).

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 26, in clause 19, page 21, line 33, leave out from “also” to second “section”.

This is a technical amendment relating to Amendment 27.

Amendment 27, in clause 19, page 21, line 34, at end insert

“, and

(b) section (Further duties about terms of service)(6) (complaints procedure relating to content that terms of service allow to be taken down or restricted).”—(Paul Scully.)

This amendment inserts a signpost to the new provision about complaints procedures inserted by NC4.

Clause 19, as amended, ordered to stand part of the Bill.

Clause 20

Duties about freedom of expression and privacy

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 28, in clause 20, page 21, line 42, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 29, 31, 36 to 38 and 40.

Paul Scully Portrait Paul Scully
- Hansard - -

I will be brief. The rights to freedom of expression and privacy are essential to our democracy. We have long been clear that the Bill must not interfere with those rights. The amendments will further strengthen protections for freedom of expression and privacy and ensure consistency in the Bill. They require regulated user-to-user and search services to have particular regard to freedom of expression and privacy when deciding on and implementing their safety measures and policies.

Amendments 28, 29 and 31 mean that service providers will need to thoroughly consider the impact that their safety and user empowerment measures have on users’ freedom of expression and privacy. That could mean, for example, providing detailed guidance and training for human reviewers about content that is particularly difficult to assess. Amendments 36 and 37 apply that to search services in relation to their safety duties. Ofcom can take enforcement action against services that fail to comply with those duties and will set out steps that platforms can take to safeguard freedom of expression and privacy in their codes of practice.

Those changes will not detract from platforms’ illegal content and child protection duties. Companies must tackle illegal content and ensure that children are protected on their services, but the amendments will protect against platforms taking an over-zealous approach to removing content or undermining users’ privacy when complying with their duties. Amendments 38 and 40 ensure that the rest of the Bill is consistent with those changes. The new duties will therefore ensure that companies give proper consideration to users’ rights when complying with them, and that that is reflected in Ofcom’s codes, providing greater clarity to companies.

Amendment 28 agreed to.

Amendments made: 29, in clause 20, page 22, line 2, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Amendment 30, in clause 20, page 22, line 6, leave out subsection (4).

This amendment removes clause 20(4), as that provision is moved to NC4.

Amendment 31, in clause 20, page 22, line 37, leave out paragraph (c) and insert—

“(c) section 14 (user empowerment),”.—(Paul Scully.)

The main effect of this amendment is that providers must consider freedom of expression and privacy issues when deciding on measures and policies to comply with clause 14 (user empowerment). The reference to clause 14 replaces the previous reference to clause 13 (adults’ safety duties), which is now removed (see Amendment 7).

Question proposed, That the clause, as amended, stand part of the Bill.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss clause 30 stand part.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.

Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.

We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regards to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.

The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.

Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.

According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually

“designed to be spammed and gamed”.

The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he is genuinely trusting the results of these polls and not gamifying them, they do not accurately represent the user base nor the best practices for confronting disinformation and harm online.

Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.

These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - -

I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.

We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that service providers interpret and implement their safety duties in a proportionate way that limits negative impacts on users’ rights to freedom of expression. However, they must also have regard to the wider definition of freedom of expression while protecting users, as the rest of the Bill provides for in a proportionate way.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove its terms of service or change them at whim in order to prevent harm from being perpetrated on that platform.

Paul Scully Portrait Paul Scully
- Hansard - -

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that drives users of those platforms to alternatives. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to meet, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully Portrait Paul Scully
- Hansard - -

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance relating to the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties require companies to enforce their terms of service properly, and not to remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties, rather than codes of practice. That guidance will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or to cover particular risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully Portrait Paul Scully
- Hansard - -

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.

Clause 21

Record-keeping and review duties

Amendments made: 32, in clause 21, page 23, line 5, leave out “, 10 or 12” and insert “or 10”.

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 33, in clause 21, page 23, line 45, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 34, in clause 21, page 24, line 6, leave out “section” and insert “sections”.

This amendment is consequential on Amendment 35.

Amendment 35, in clause 21, page 24, line 6, at end insert—

“, (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (duties about terms of service).”—(Paul Scully.)

This amendment ensures that providers have a duty to review compliance with the duties set out in NC3 and NC4 regularly, and after making any significant change to the design or operation of the service.

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Given that there are few changes to this clause from when the Bill was amended in the previous Public Bill Committee, I will be brief. We in the Opposition are clear that record-keeping and review duties on in-scope services make up an important function of the regulatory regime and sit at the very heart of the Online Safety Bill. We must push platforms to transparently report all harms identified and the action taken in response, in line with regulation.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Specifically on the issue that was just raised, there were two written ministerial statements on the Online Safety Bill. The first specifically said that an amendment would

“require the largest platforms to publish summaries of their risk assessments for illegal content and material that is harmful to children, to allow users and empower parents to clearly understand the risks presented by these services and the approach platforms are taking to children’s safety”.—[Official Report, 29 November 2022; Vol. 723, c. 31WS.]

Unless I have completely missed an amendment that has been tabled for this Committee, my impression is that that amendment will be tabled in the Lords and that details will be made available about how exactly the publishing will work and which platforms will be required to publish.

I would appreciate it if the Minister could provide more clarity about what that might look like, and about which platforms might have to publish their assessments. I appreciate that that will be scrutinised in the Lords but, to be fair, this is the second time that the Bill has been in Committee in the Commons. It would be helpful if we could be a bit more sighted on what exactly the Government intend to do—meaning more than the handful of lines in a written ministerial statement—because then we would know whether the proposal is adequate, or whether we would have to ask further questions in order to draw it out and ensure that it is published in a certain form. The more information the Minister can provide, the better.

Paul Scully Portrait Paul Scully
- Hansard - -

I think we all agree that written records are hugely important. They are important as evidence in cases where Ofcom is considering enforcement action, and a company’s compliance review should be done regularly, especially before it makes changes to its service.

The Bill does not intend to place excessive burdens on small and low-risk businesses. As such, clause 21 provides Ofcom with the power to exempt certain types of service from the record-keeping and review duties. However, the details of any exemptions must be published.

To half-answer the point made by the hon. Member for Aberdeen North, the measures will be brought to the Lords, but I will endeavour to keep her up to date as best we can so that we can continue the conversation. We have served together on several Bill Committees, including on technical Bills that required us to spend several days in Committee—although they did not come back for re-committal—so I will endeavour to keep her and, indeed, the hon. Member for Pontypridd, up to date with developments.

Question put and agreed to. 

Clause 21, as amended, accordingly ordered to stand part of the Bill.

Clause 30

Duties about freedom of expression and privacy

Amendments made: 36, in clause 30, page 31, line 31, after “have” insert “particular”.

This amendment has the result that providers of regulated search services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Amendment 37, in clause 30, page 31, line 34, after “have” insert “particular”.—(Paul Scully.)

This amendment has the result that providers of regulated search services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Clause 30, as amended, ordered to stand part of the Bill.

Clause 46

Relationship between duties and codes of practice

Amendments made: 38, in clause 46, page 44, line 27, after “have” insert “particular”.

This amendment has the result that providers of services who take measures other than those recommended in codes of practice in order to comply with safety duties must have particular regard to freedom of expression and users’ privacy.

Amendment 39, in clause 46, page 45, line 12, leave out paragraph (c).

This amendment is consequential on Amendment 7 (removal of clause 13).

Amendment 40, in clause 46, page 45, line 31, at end insert “, or

(ii) a duty set out in section 14 (user empowerment);”.—(Paul Scully.)

This amendment has the effect that measures recommended in codes of practice to comply with the duty in clause 14 are relevant to the question of whether a provider is complying with the duties in clause 20(2) and (3) (having regard to freedom of expression and users’ privacy).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I do not wish to repeat myself and test the Committee’s patience, so I will keep my comments brief. As it stands, service providers would be treated as complying with their duties if they had followed the recommended measures set out in the relevant codes of practice, as set out in the Bill. However, providers could take alternative measures to comply, but as I said in previous Committee sittings, Labour remains concerned that the definition of “alternative measures” is far too broad. I would be grateful if the Minister elaborated on his assessment of the instances in which a service provider may seek to comply via alternative measures.

The codes of practice should be, for want of a better phrase, best practice. Labour is concerned that, to avoid the duties, providers may choose to take the “alternative measures” route as an easy way out. We agree that it is important to ensure that providers have a duty with regard to protecting users’ freedom of expression and personal privacy. As we have repeatedly said, the entire Online Safety Bill regime relies on that careful balance being at the forefront. We want to see safety at the forefront, but recognise the importance of freedom of expression and personal privacy, and it is right that those duties are central to the clause. For those reasons, Labour has not sought to amend this part of the Bill, but I want to press the Minister on exactly how he sees this route being used.

Paul Scully Portrait Paul Scully
- Hansard - -

It is important that service providers have flexibility, so that the Bill does not disincentivise innovation or force service providers to use measures that might not work for all business models or technological contexts. The tech sector is diverse and dynamic, and it is appropriate that companies can take innovative approaches to fulfilling their duties. In most circumstances, we expect companies to take the measures outlined in Ofcom’s code of practice as the easiest route to compliance. However, where a service provider takes alternative measures, Ofcom must consider whether those measures safeguard users’ privacy and freedom of expression appropriately. Ofcom must also consider whether they extend across all relevant areas of a service mentioned in the illegal content and children’s online safety duties, such as content moderation, staff policies and practices, design of functionalities, algorithms and other features. Ultimately, it will be for Ofcom to determine a company’s compliance with the duties, which are there to ensure users’ safety.

Question put and agreed to.

Clause 46, as amended, accordingly ordered to stand part of the Bill.

Clause 55 disagreed to.

Clause 56

Regulations under sections 54 and 55

Amendments made: 42, in clause 56, page 54, line 40, leave out subsection (3).

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 43, in clause 56, page 54, line 46, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 44, in clause 56, page 55, line 8, leave out “or 55”.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 45, in clause 56, page 55, line 9, leave out

“or adults are to children or adults”

and insert “are to children”.—(Paul Scully.)

This amendment is consequential on Amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

As we know, the clause makes provision in relation to the making of regulations designating primary and priority content that is harmful to children, and priority content that is harmful to adults. The Secretary of State may specify a description of content in regulations only if they consider that there is a material risk of significant harm to an appreciable number of children or adults in the United Kingdom presented by user-generated or search content of that description, and must consult Ofcom before making such regulations.

In the last Bill Committee, Labour raised concerns that there were no duties that required the Secretary of State to consult others, including expert stakeholders, ahead of making these regulations. That decision cannot be for one person alone. When it comes to managing harmful content, unlike illegal content, we can all agree that it is about implementing systems that prevent people from encountering it, rather than removing it entirely.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I completely agree: we are now on our third Secretary of State, our third Minister and our third Prime Minister since we began considering this iteration of the Bill. It is vital that this does not come down to one person’s ideological beliefs. We have spoken at length about this issue; the hon. Member for Don Valley has spoken about his concerns that Parliament should be sovereign, and should make these decisions. It should not be for one individual or one stakeholder to make these determinations.

We also have issues with the Government’s chosen toggle approach—we see that as problematic. We have debated it at length, but our concerns regarding clause 56 are about the lack of consultation that the Secretary of State of the day, whoever that may be and whatever political party they belong to, will be required to undertake before making widespread changes to the regime. I am afraid that those concerns still exist, and they are held not just by us, but by stakeholders and by Members of all political persuasions across the House. However, since our proposed amendment was voted down in the previous Bill Committee, nothing has changed. I will spare colleagues from once again hearing my pleas about the importance of consultation when it comes to determining all things related to online safety, but while Labour Members do not formally oppose the clause, we hope that the Minister will address our widespread concerns about the powers of the Secretary of State in his remarks.

Paul Scully Portrait Paul Scully
- Hansard - -

I appreciate the hon. Lady’s remarks. We have tried to ensure that the Bill is proportionate, inasmuch as the Secretary of State can designate content if there is material risk of significant harm to an appreciable number of children in the United Kingdom. The Bill also requires the Secretary of State to consult Ofcom before making regulations on the priority categories of harm.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I appreciate that this point has been made about the same wording earlier today, but I really feel that the ambiguity of “appreciable number” is something that could do with being ironed out. The ambiguity and vagueness of that wording make it very difficult to enforce the provision. Does the Minister agree that “appreciable number” is too vague to be of real use in legislation such as this?

Paul Scully Portrait Paul Scully
- Hansard - -

The different platforms, approaches and conditions will necessitate different numbers; it would be hard to pin a number down. The wording is vague and wide-ranging because it is trying to capture any number of scenarios, many as yet unknown. However, the regulations designating priority harms will be made under the draft affirmative resolution procedure.

Sarah Owen Portrait Sarah Owen
- Hansard - - - Excerpts

On that point, which we discussed earlier—my hon. Friend the Member for Warrington North discussed it—I am struggling to understand what is an acceptable level of harm, and what is the acceptable number of people to be harmed, before a platform has to act.

Paul Scully Portrait Paul Scully
- Hansard - -

It totally depends on the scenario. It is very difficult for me to stand here now and give a wide range of examples, but the Secretary of State will be reacting to a given situation, rather than trying to predict every situation in advance.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The Minister has just outlined exactly what our concerns are. He is unable to give an exact number, figure or issue, but that is what the Secretary of State will have to do, without having to consult any stakeholders regarding that issue. There are many eyes on us around the world, with other legislatures looking at us and following suit, so we want the Bill to be world-leading. Many Governments across the world may deem that homosexuality, for example, is of harm to children. Because this piece of legislation creates precedent, a Secretary of State in such a Government could determine that any platform in that country should take down all that content. Does the Minister not see our concerns in that scenario?

Paul Scully Portrait Paul Scully
- Hansard - -

I was about to come on to the fact that the Secretary of State would be required to consult Ofcom before making regulations on the priority categories of harm. Indeed Ofcom, just like the Secretary of State, speaks to and engages with a number of stakeholders on this issue to gain a deeper understanding. Regulations designating priority harms would be made under the draft affirmative resolution procedure, but there is also provision for the Secretary of State to use the made affirmative resolution procedure in urgent scenarios, and this would be an urgent scenario. It is about getting the balance right.

--- Later in debate ---
Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

That concern would be triggered by Ofcom discovering things as a consequence of user complaints. Although Ofcom is not a complaints resolution body, users can complain to it. Independent academics and researchers may produce studies and reports highlighting problems at any time, so Ofcom does not have to wait through an annual cycle of transparency reporting. At any time, Ofcom can say, “We want to have a deeper look at this problem.” It could be something Ofcom or someone else has discovered, and Ofcom can either research that itself or appoint an outside expert.

As the hon. Member for Warrington North mentioned, very sensitive information might become apparent through the transparency reporting that one might not necessarily wish to make public because it requires further investigation and could highlight a particular flaw that could be exploited by bad actors. I would hope and expect, as I think we all would, that we would have the routine publication of transparency reporting to give people assurance that the platforms are meeting their obligations. Indeed, if Ofcom were to intervene against a platform, it would probably use information gathered and received to provide the rationale for why a fine has been issued or another intervention has been made. I am sure that Ofcom will draw all the time on information gathered through transparency reporting and, where relevant, share it.

Paul Scully Portrait Paul Scully
- Hansard - -

This has been a helpful debate. Everyone was right that transparency must be and is at the heart of the Bill. From when we were talking earlier today about how risk assessments and terms of service must be accessible to all, through to this transparency reporting section, it is important that we hold companies to account and that the reports play a key role in allowing users, Ofcom and civil society, including those in academia, to understand the steps that companies are taking to protect users.

Under clause 65, category 1 services, category 2A search services and category 2B user-to-user services need to publish transparency reports annually in accordance with the transparency report notice from Ofcom. That relates to the points about commerciality that my hon. Friend the Member for Folkestone and Hythe talked about. Ofcom will set out what information is required from companies in their notice, which will also specify the format, manner and deadline for the information to be provided to Ofcom. Clearly, it would not be proportionate to require every service provider within the scope of the overall regulatory framework to produce a transparency report—it is also important that we deal with capacity and proportionality—but those category threshold conditions will ensure that the framework is flexible and future-proofed.

Charlotte Nichols Portrait Charlotte Nichols
- Hansard - - - Excerpts

I note what the Minister said about the commercial implications of some of these things, and some of those commercial implications might act as levers to push companies to do better on some things. By that same token, should this information not be more transparent and publicly available to give the user the choice he referred to earlier? That would mean that if a user’s data was not being properly protected and these companies were not taking the measures around safety that the public would expect, users can vote with their feet and go to a different platform. Surely that underpins a lot of what we have been talking about.

Paul Scully Portrait Paul Scully
- Hansard - -

Yes, and that is why Ofcom will be the one that decides which information should be published, and from whom, to ensure that it is proportionate. At the end of the day, I have talked about the fact that transparency is at the heart of the Bill and that the transparency reports are important. To go to the original point raised by the hon. Member for Pontypridd about when these reports will be published, they will indeed be published in accordance with subsection (3)(d) of the clause.

Question put and agreed to.

Clause 65 accordingly ordered to stand part of the Bill.

Schedule 8

Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services

Amendments made: 61, in schedule 8, page 203, line 13, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 62, in schedule 8, page 203, line 15, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 63, in schedule 8, page 203, line 17, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 65, in schedule 8, page 203, line 25, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 66, in schedule 8, page 203, line 29, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 67, in schedule 8, page 203, line 41, at end insert—

“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.

Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert

“or content that is harmful to children—”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert

“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)

This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6 (and Chapter 1 of Part 4).

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 72, in schedule 8, page 206, line 5, at end insert—

“35A (1) For the purposes of this Schedule, content of a particular kind is ‘relevant content’ if—

(a) a term of service, other than a term of service within sub-paragraph (2), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

(2) The terms of service within this sub-paragraph are as follows—

(a) terms of service which make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children);

(b) terms of service which deal with the treatment of consumer content.

(3) References in this Schedule to relevant content are to content that is relevant content in relation to the service in question.”

This amendment defines “relevant content” for the purposes of Schedule 8.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss Government amendments 73 and 75.

Paul Scully Portrait Paul Scully
- Hansard - -

The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority in terms of transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission—specifying priority content harmful to adults. For reasons we have covered at length, we think that it is a gross mistake on the Government’s side to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report back on this type of harm, given its absence in this group of amendments and the lack of a definition.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am pleased to see the list included and the number of things that Ofcom can ask for more information on. I have a specific question about amendment 75. Amendment 75 talks about regulated user-generated content and says it has the same meaning as it does in the interpretation of Part 3 under clause 50. The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone services—but we are talking about a voice chat that people can have with people who they do not know, whose phone number they do not know and who they have no sort of relationship with.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

Paul Scully Portrait Paul Scully
- Hansard - -

I do not believe that the provisions on Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protection for adults strikes the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are more new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful to children, which includes proactive steps to address offences such as child sexual exploitation and abuse.

The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.

Paul Scully Portrait Paul Scully
- Hansard - -

I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.

Amendment 72 agreed to.

Amendments made: 73, in schedule 8, page 206, line 6, at end insert—

“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.

This amendment defines “consumer content” for the purposes of Schedule 8.

Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 75, in schedule 8, page 206, line 12, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 8.

Schedule 8, as amended, agreed to.

Ordered, That further consideration be now adjourned. —(Mike Wood.)

ONLINE SAFETY BILL (Third sitting)

Paul Scully Excerpts
Committee stage (re-committed clauses and schedules)
Thursday 15th December 2022


Public Bill Committees
Amendment Paper: Public Bill Committee Amendments as at 15 December 2022
Meaning of threshold conditions etc
Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- Hansard - -

I beg to move amendment 48, in clause 82, page 72, line 21, at end insert—

“(ca) a regulated user-to-user service meets the conditions in section (List of emerging Category 1 services)(2) if those conditions are met in relation to the user-to-user part of the service;”.

This is a technical amendment ensuring that references to user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Government amendment 49.

Government new clause 7—List of emerging Category 1 services.

Paul Scully Portrait Paul Scully
- Hansard - -

These Government amendments confer a duty on Ofcom to create and publish a list of companies that are approaching the category 1 threshold, to ensure that it proactively identifies emerging high-reach, high-influence companies and is ready to add them to the category 1 register without delay. That is being done in recognition of the rapid pace of change in the tech industry, in which companies can grow quickly. The changes mean that Ofcom can designate companies as category 1 at pace. That responds to concerns that platforms could become unexpectedly popular and grow quickly in size, and that there could be delays in capturing them as category 1 platforms. Amendments 48 and 49 are consequential on new clause 7, which imposes that duty on Ofcom. For those reasons, I recommend that the amendments be accepted.

Alex Davies-Jones Portrait Alex Davies-Jones (Pontypridd) (Lab)
- Hansard - - - Excerpts

It will come as no surprise to Members to hear that we have serious concerns about the system of categorisation and the threshold conditions for platforms and service providers, given our long-standing view that the approach taken is far too inflexible.

In previous sittings, we raised the concern that the Government have not provided enough clarity about what will happen if a service is required to shift from one category to another, and how long that will take. We remain unclear about that, about how shifting categories will work in practice, and about how long Ofcom will have to preside over such changes and decisions.

I have been following this Bill closely for just over a year, and I recognise that the online space is constantly changing and evolving. New technologies are popping up that will make this categorisation process even more difficult. The Government must know that their approach does not capture smaller, high-harm platforms, which we know—we have debated this several times—can be at the root of some of the most dangerous and harmful content out there. Will the Minister clarify whether the Government amendments will allow Ofcom to consider adding such small, high-harm platforms to category 1, given the risk of harm?

More broadly, we are pleased that the Government tabled new clause 7, which will require Ofcom to prepare and update a list of regulated user-to-user services that have 75% of the number of users of a category 1 service, and at least one functionality of a category 1 service, or one required combination of a functionality and another characteristic or factor of a category 1 service. It is absolutely vital that Ofcom, as the regulator, is sufficiently prepared, and that there is monitoring of regulated user-to-user services so that this regime is as flexible as possible and able to cope with the rapid changes in the online space. That is why the Opposition support new clause 7 and have not sought to amend it. Moreover, we also support Government amendments 48 and 49, which are technical amendments to ensure that new clause 7 references user-to-user services and assessments of those services appropriately. I want to press the Minister on how he thinks these categories will work, and on Ofcom’s role in that.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

I agree with everything that the hon. Lady said. New clause 7 is important. It was missing from the earlier iterations of the Bill, and it makes sense to have it here, but it raises further concerns about the number of people who are required to use a service before it is classed as category 1. We will come later to our amendment 104 to schedule 11, which is about adding high-risk platforms to the categorisation.

I am still concerned that the numbers are a pretty blunt instrument for categorising something as category 1. The number may end up being particularly high. I think it would be very easy for the number to be wrong—for it to be too high or too low, and probably too high rather than too low.

If Twitter were to disappear, which, given the changing nature of the online world, is not outside the realms of possibility, we could see a significant number of other platforms picking up the slack. A lot of them might have fewer users, but the same level of risk as platforms such as Twitter and Facebook. I am still concerned that choosing a number is a very difficult thing to get right, and I am not totally convinced that the Government’s way of going about this is right.

Paul Scully Portrait Paul Scully
- Hansard - -

Ofcom will assess services that are close to meeting the threshold conditions of category 1 services and will publish a publicly available list of those emerging high-risk services. A service would have to meet two conditions to be added to the emerging services list: it would need at least 75% of the user number specified in any category 1 threshold condition, and at least one functionality of a category 1 threshold condition, or one specified combination of a functionality and a characteristic or factor of a category 1 threshold condition.

Ofcom will monitor the emergence of new services. If it becomes apparent that a service has grown sufficiently to meet the threshold of becoming a category 1 service, Ofcom will be required to add that service to the register. The new clause and the consequential amendments take into account the possibility of quick growth.

Following the removal of “legal but harmful” duties, category 1 services will be subject to new transparency, accountability and free speech duties, as well as duties relating to protection for journalists and democratic content. Requiring all companies to comply with that full range of category 1 duties would pose a disproportionate regulatory burden on smaller companies that do not exert the same influence on public discourse, and that would possibly divert those companies’ resources away from tackling vital tasks.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Will my hon. Friend confirm that the risk assessments for illegal content—the priority illegal offences; the worst kind of content—apply to all services, whether or not they are category 1?

Paul Scully Portrait Paul Scully
- Hansard - -

My hon. Friend is absolutely right. All companies will still have to tackle the risk assessment, and will have to remove illegal content. We are talking about the extra bits that could take a disproportionate amount of resource from core functions that we all want to see around child protection.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I would push the Minister further. He mentioned that there will not be an onus on companies to tackle the “legal but harmful” duty now that it has been stripped from the Bill, but we know that disinformation, particularly around elections in this country, is widespread on these high-harm platforms, and they will not be in scope of category 2. We have debated that at length. We have debated the time it could take Ofcom to act and put those platforms into category 1. Given the potential risk of harm to our democracy as a result, will the Minister press Ofcom to act swiftly in that regard? We cannot put that in the Bill now, but time is of the essence.

Paul Scully Portrait Paul Scully
- Hansard - -

Absolutely. The Department has techniques for dealing with misinformation and disinformation as well, but we will absolutely push Ofcom to work as quickly as possible. As my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), the former Secretary of State, has said, once an election is done, it is done and it cannot be undone.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Could the Minister also confirm that the provisions of the National Security Bill read across to the Online Safety Bill? Where disinformation is disseminated by networks operated by hostile foreign states, particularly Russia, as has often been the case, that is still in scope. That will still require a risk assessment for all platforms, whether or not they are category 1.

Paul Scully Portrait Paul Scully
- Hansard - -

Indeed. We need to take a wide-ranging, holistic view of disinformation and misinformation, especially around election times. There is a suite of measures available to us, but it is still worth pushing Ofcom to make sure that it works as quickly as possible.

Amendment 48 agreed to.

Amendment made: 49, in clause 82, page 72, line 23, after “conditions” insert

“or the conditions in section (List of emerging Category 1 services)(2)”.—(Paul Scully.)

This is a technical amendment ensuring that references to assessments of user-to-user services in the new clause inserted by NC7 relate to the user-to-user part of the service.

Clause 82, as amended, ordered to stand part of the Bill.

Schedule 11

Categories of regulated user-to-user services and regulated search services: regulations

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move amendment 76, in schedule 11, page 213, line 11, at end insert

“, and

(c) any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”

This amendment provides that regulations specifying Category 1 threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.

None Portrait The Chair
- Hansard -

With this, it will be convenient to discuss Government amendments 77 to 79, 81 to 84, 86 to 91 and 93.

Paul Scully Portrait Paul Scully
- Hansard - -

These Government amendments seek to change the approach to category 1 designation, following the removal from the Bill of the adult safety duties and the concept of “legal but harmful” content. Through the proposed new duties on category 1 services, we aim to hold companies accountable to their terms of service, as we have said. I seek to remove all requirements on category 1 services relating to harmful content, so it is no longer appropriate to designate them with reference to harm. Consequently, the amendments in this group change the approach to designating category 1 services, to ensure that only the largest companies with the greatest influence over public discourse are designated as category 1 services.

Specifically, these amendments will ensure that category 1 services are so designated where they have functionalities that enable easy, quick and wide dissemination of user-generated content, while the requirement for category 1 services to meet a number-of-users threshold remains unchanged.

The amendments also give the Secretary of State the flexibility to consider other characteristics of services, as well as other relevant factors. Those characteristics might include a service’s functionalities, the user base, the business model, governance, and other systems and processes. That gives the designation process greater flexibility to ensure that services are designated category 1 services only when they have significant influence over public discourse.

The amendments also seek to remove the use of criteria for content that is harmful to adults from category 2B, and we have made a series of consequential amendments to the designation process for categories 2A and 2B to ensure consistency.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I have commented extensively on the flaws in the categorisation process in this and previous Committees, so I will not retread old ground. I accept the amendments in this grouping. They show that the Government are prepared to broaden the criteria for selecting which companies are likely to be in category 1. That is a very welcome, if subtle, shift in the right direction.

The amendments bring the characteristics of a company’s service into consideration, which will be a slight improvement on the previous focus on size and functionality, so we welcome them. The distinction is important, because size and functionality alone are obviously very vague indicators of harm, or the threat of harm.

We are pleased to see that the Government have allowed for a list to be drawn up of companies that are close to the margins of category 1, or that are emerging as category 1 companies. This is a positive step for regulatory certainty, and I hope that the Minister will elaborate on exactly how the assessment will be made.

However, I draw the Minister’s attention to Labour’s long-held concern about the Bill’s over-reliance on powers afforded to the Secretary of State of the day. We debated this concern in a previous sitting. I press the Minister again on why these amendments, and the regulations around the threshold conditions, are ultimately only for the Secretary of State to consider, depending on characteristics or factors that only he or she, whoever they may be, deems relevant.

We appreciate that the regulations need some flexibility, but we have genuine concerns—indeed, colleagues from all parties have expressed such concerns—that the Bill will give the Secretary of State far too much power to determine how the entire online safety regime is imposed. I ask the Minister to give the Committee an example of a situation in which it would be appropriate for the Secretary of State to make such changes without any consultation with stakeholders or the House.

It is absolutely key for all of us that transparency should lie at the heart of the Bill. Once again, we fear that the amendments are a subtle attempt by the Government to impose on what is supposed to be an independent regulatory process the whim of one person. I would appreciate assurance on that point. The Minister knows that these concerns have long been held by me and colleagues from all parties, and we are not alone in those concerns. Civil society groups are also calling for clarity on exactly how decisions will be made, and particularly on what information will be used to determine a threshold. For example, do the Government plan on quantifying a user base, and will the Minister explain how the regime would work in practice, when we know that a platform’s user base can fluctuate rapidly? We have seen that already with Mastodon, whose users have increased dramatically as a result of Elon Musk’s takeover of Twitter. I hope that the Minister can reassure me about those concerns. He will know that this is a point of contention for colleagues from across the House, and we want to get the Bill right before we progress to Report.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

My understanding is that only a very small number of platforms will reach the category 1 threshold. We are talking about the platforms that everybody has heard of—Facebook, Twitter and so on—and not about the slightly smaller platforms that lots of people have heard of and use. We are probably not talking about platforms such as Twitch, which has a much smaller user base than Facebook and Twitter but has a massive reach. My concern continues to be that the number threshold does not take into account the significant risks of harm from some of those platforms.

I have a specific question about amendment 76. I agree with my Labour Front-Bench colleague, the hon. Member for Pontypridd, that it shows that the Government are willing to take into account other factors. However, I am concerned that the Secretary of State is somehow being seen as the arbiter of knowledge—the person who is best placed to make the decisions—when much more flexibility could have been given to Ofcom instead. From all the evidence I have heard and all the people I have spoken to, Ofcom seems much more expert in dealing with what is happening today than any Secretary of State could ever hope to be. There is no suggestion about how the Secretary of State will consult, get information and make decisions on how to change the threshold conditions.

It is important that other characteristics that may not relate to functionalities are included if we discover that there is an issue with them. For example, I have mentioned livestreaming on a number of occasions in Committee, and we know that livestreaming is inherently incredibly risky. The Secretary of State could designate livestreaming as a high-risk functionality, and it could be included, for example, in category 1. I do not know whether it will be, but we know that there are risks there. How will the Secretary of State get that information?

There is no agreement to set up a user advocacy board. The requirement for Ofcom to consult the Children’s Commissioner will be brought in later, but organisations such as the National Society for the Prevention of Cruelty to Children, which deals with phone calls from children asking for help, are most aware of emerging threats. My concern is that the Secretary of State cannot possibly be close enough to the issue to make decisions, unless they are required to consult and listen to organisations that are at the coal face and that regularly support people. I shall go into more detail about high-harm platforms when we come to amendment 104.

Paul Scully Portrait Paul Scully
- Hansard - -

The amendments give the Secretary of State the flexibility to consider other characteristics of services as well as other relevant factors, which include functionalities, user base, business model, governance, and other systems and processes. They effectively introduce greater flexibility into the designation process, so that category 1 services are designated only if they have significant influence over public discourse. Although the Secretary of State will make the regulations, Ofcom will carry out the objective and evidence-based process, which will be subject to parliamentary scrutiny via statutory instruments. The Secretary of State will have due consultation with Ofcom at every stage, but to ensure flexibility and the ability to move fast, it is important that the Secretary of State has those powers.

Amendment 76 agreed to.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.

The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small number of users, or with a number of users that does not number into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady places on the internet that none of us has experienced or even heard of. Those are the places that have a massive impact and effect.

We know that one person can have a significant impact on the world and on people’s lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can cause massive, damaging effects to anybody they choose to take physical action against, and to some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of the breadcrumbing and the spread that there can be on smaller platforms.

The most extreme views do not necessarily tip over into “illegal” or “incitement”; they do not actually say, “Please go out and kill everybody in this particular group.” They say, “This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life”, and people are therefore driven to take extremist, terrorist action. That is a significant issue.

I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.

Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.

I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls, or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—significantly more than a half—of pornography on mainstream sites that involves black women also involves violence. That is completely and totally unacceptable, and has a massive negative impact on society, whereby it reinforces negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.

It is really grim that we are requiring a number of users to be specified, when we know the harm that is caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks has a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required for category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to a higher bar and require them to have the transparency that they need as a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.

The reason I have said that it should be Ofcom’s responsibility to designate category 1 services is that it has the experts who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and transparencies that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that it should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to people who are the most discriminated against as it is and who are the most at risk of harm from extremism. I urge the Minister to think again.

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - -

As debated earlier, we are removing the adult safety duties from the Bill, which means that no company will face any duties related to legal but harmful content. In their place, the Government are introducing new transparency, accountability and free speech duties on category 1 services. They have been discussed in detail earlier this session.

It would not be proportionate to apply those new duties to smaller services, but, as we have heard from my hon. Friend the Member for Folkestone and Hythe, they will still have to comply with the illegal content and child safety duties if they are accessed by children. Those services have limited resources, and applying blanket additional duties to them would divert those resources away from complying with the illegal content and child safety duties. That would likely weaken the duties’ impact on tackling criminal activity and protecting children.

The new duties are about user choice and accountability on the largest platforms—if users do not want to use smaller harmful sites, they can choose not to—but, in recognition of the rapid pace with which companies can grow, I introduced an amendment earlier to create a watchlist of companies that are approaching the category 1 threshold, which will ensure that Ofcom can monitor rapidly scaling companies, reduce any delay in designating companies as category 1 services, and apply additional obligations on them.

The hon. Member for Aberdeen North talked about ISPs acting with respect to Kiwi Farms. I talked on Tuesday about the need for a holistic approach. There is not one silver bullet. It is important to look at Government, the platforms, parenting and ISPs, because that makes up a holistic view of how the internet works. It is the multi-stakeholder framework of governing the internet in its entirety, rather than the Government trying to do absolutely everything. We have talked a lot about illegality, and I think that a lot of the areas in that case were illegal; the hon. Lady described some very distasteful things. None the less, with the introduction of the watchlist, I do not believe amendment 104 is required.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

The hon. Member for Folkestone and Hythe made a good point. I do not disagree that Ofcom will have a significant role in policing platforms that are below the category 1 threshold. I am sure it will be very hands on, particularly with platforms that have the highest risk and are causing the most harm.

I still do not think that is enough. I do not think that the Minister’s change with regard to emerging platforms should be based on user numbers. It is reasonable for us to require platforms that encourage extremism, spread conspiracy theories and have the most horrific pornography on them to meet a higher bar of transparency. I do not really care if they only have a handful of people working there. I am not fussed if they say, “Sorry, we can’t do this.” If they cannot keep people safe on their platform, they should have to meet a higher transparency bar, provide more information on how they are meeting their terms of service and provide toggles—all those things. It does not matter how small these platforms are. What matters is that they have massive risks and cause massive amounts of harm. It is completely reasonable that we hold them to a higher regulatory bar. On that basis, I will push the amendment to a vote.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Thank you, Dame Angela—take 2.

Clause 115 focuses on the enforcement action that may be taken, which will be triggered if a platform fails to comply. Given that the enforceable requirements may include, for example, duties to carry out and report on risk assessments and general safety duties, it is a shame that the Government have not seen the merits of going further with these provisions. I point the Minister to the previous Public Bill Committee, where Labour made some sensible suggestions for how to remedy the situation. Throughout the passage of the Bill, we have made it abundantly clear that more access to, and availability of, data and information about systems and processes would improve understanding of the online environment.

We cannot and should not rely solely on Ofcom to act as problems arise, when those problems could be spotted earlier by experts elsewhere. We have already heard the Minister outline the immense task that Ofcom has ahead of it to monitor risk assessments and platforms, ensuring that platforms comply and taking action where there is illegal content and a risk to children. It is important that Ofcom has at its disposal all the help it needs.

It would be helpful if there were more transparency about how the enforcement provisions work in practice. We have repeatedly heard that without independent researchers accessing data on relevant harm, platforms will have no real accountability for how they tackle online harm. I hope that the Minister can clarify why, once again, the Government have not seen the merit of encouraging transparency in their approach. It would be extremely valuable and helpful to both the online safety regime and the regulator as a whole, and it would add merit to the clause.

Paul Scully Portrait Paul Scully
- Hansard - -

We have talked about the fact that Ofcom will have robust enforcement powers. It can direct companies to take specific steps to come into compliance or to remedy failure to comply, as well as issue fines and apply to the courts for business disruption measures. Indeed, Ofcom can institute criminal proceedings against senior managers who are responsible for compliance with an information notice, when they have failed to take all reasonable steps to ensure the company’s compliance with that notice. That criminal offence will commence two months after Royal Assent.

Ofcom will be required to produce enforcement guidelines, as it does in other areas that it regulates, explaining how it proposes to use its enforcement powers. It is important that Ofcom is open and transparent, and that companies and people using the services understand exactly how to comply. Ofcom will provide those guidelines. People will be able to see who the users of the services are. The pre-emptive work will come from the risk assessments that platforms themselves will need to produce.

We will take a phased approach to bringing the duties under the Bill into effect. Ofcom’s initial focus will be on illegal content, so that the most serious harms can be addressed as soon as possible. When those codes of practice and guidelines come into effect, the hon. Member for Pontypridd will see some of the transparency and openness that she is looking for.

Question put and agreed to.

Clause 115, as amended, accordingly ordered to stand part of the Bill.

Clause 155

Review

Amendment made: 56, in clause 155, page 133, line 27, after “Chapter 1” insert “or 2A”.—(Paul Scully.)

Clause 155 is about a review by the Secretary of State of the regulatory framework established by this Bill. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am glad that there is a review function in the Bill. I have been a member of a lot of Bill Committees and Delegated Legislation Committees that have considered legislation that has no review function and that says, “This will be looked at in the normal course of departmental reviews.” We know that not all Departments always do such reviews. In fact, some Departments do under 50% of the reviews that they are supposed to do, and whether reviews take place is not checked. We therefore do not find out whether a piece of legislation has had the intended effect. I am sure some will have done, but some definitely will not.

If the Government do not internally review whether a Bill or piece of delegated legislation has had the effect it was supposed to have, they cannot say whether it has been a success and cannot make informed decisions about future legislation, so having a review function in this Bill is really good. However, that function is insufficient: it is not enough for the Secretary of State to do the review, and we will not see enough outputs from Ofcom.

The Bill has dominated the lives of a significant number of parliamentarians for the past year—longer, in some cases—because it is so important and because it has required so much scrutiny, thinking and information gathering to get to this stage. That work will not go away once the Bill is enacted. Things will not change or move at once, and parts of the legislation will not work as effectively as they could, as is the case for any legislation, whether moved by my Government or somebody else’s. In every piece of legislation there will be things that do not pan out as intended, but a review by the Secretary of State and information from Ofcom about how things are working do not seem to be enough.

Committee members, including those on the Government Benches, have suggested having a committee to undertake the review or adding that function to the responsibilities of the Digital, Culture, Media and Sport Committee. We know that the DCMS Committee is busy and will be looking into a significant number of wide-ranging topics, so it would be difficult for it to keep a watching brief on the Online Safety Bill.

The previous Minister said that there will be some sort of reviewing mechanism, but I would like further commitment from the Government that the Bill will be kept under review and that the review process as set out will not be the only type of review that happens as things move and change and the internet develops. Many people talk about more widespread use of virtual reality, for example, but there could be other things that we have not even heard of yet. After the legislation is implemented, it will be years before every part of the Bill is in action and every requirement in the legislation is working. By the time we get to 2027-28—or whenever every part of the legislation is working—things could have changed again and be drastically different to today. Indeed, the legislation may not be fit for purpose when it first starts to work, so will the Minister provide more information about what the review process will look like on an ongoing basis? The Government say this is world-leading legislation, but how will we ensure that that is the case and that it makes a difference to the safety and experience of both children and adults online?

Paul Scully Portrait Paul Scully
- Hansard - -

I am glad that we are all in agreement on the need for a review. It is important that we have a comprehensive and timely review of the regulatory regime and how it is built into legislation. It is important that we understand that the legislation has the impact that we intend.

The legislation clearly sets out what the review must consider, how Ofcom is carrying out its role and if the legislation is effective in dealing with child protection, which as the hon. Lady rightly says is its core purpose. We have struck the balance of specifying two to five years after the regime comes into force, because it provides a degree of flexibility to future Ministers to judge when it should happen. None the less, I take the hon. Lady’s point that technology is developing. That is why this is a front-footed first move in this legislation, when other countries are looking at what we are doing; because of that less prescriptive approach to technologies, the legislation can be flexible and adapt to emerging new technologies. Inevitably, this will not be the last word. Some of the things in the Digital Economy Act 2017, for example, are already out of date, as is some of the other legislation that was put in place in the early 2000s. We will inevitably come back to this, but I think we have the right balance at the moment in terms of the timing.

I do not think we need to bed in whom we consult, but wider consultation will none the less be necessary to ascertain the effectiveness of the legislation.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am following carefully what the Minister says, but I would say briefly that a lot of the debate we have had at all stages of the Bill has rested on how we believe Ofcom will use the powers it has been given, and we need to make sure that it does that. We need to ensure that it is effective and that it has the resources it needs. The hon. Member for Aberdeen North (Kirsty Blackman) makes an important point that it may not be enough to rely on a Select Committee of the Lords or the Commons having the time to do that in the detail we would want. We might need to consider either a post-legislative scrutiny Committee or some other mechanism to ensure that there is the necessary level of oversight.

Paul Scully Portrait Paul Scully
- Hansard - -

My hon. Friend is absolutely right. The report, as it stands, obviously has to be laid before Parliament and will form part of the package of parliamentary scrutiny. But, yes, we will consider how we can utilise the expertise of both Houses in post-legislative scrutiny. We will come back on that.

Question put and agreed to.

Clause 155, as amended, accordingly ordered to stand part of the Bill.

Clause 169

Individuals providing regulated services: liability

Amendment made: 57, in clause 169, page 143, line 15, at end insert—

“(fa) Chapter 2A of Part 4 (terms of service: transparency, accountability and freedom of expression);”.—(Paul Scully.)

Clause 169 is about liability of providers who are individuals. This amendment inserts a reference to Chapter 2A, which is the new Chapter expected to be formed by NC3 to NC6, so that individuals may be jointly and severally liable for the duties imposed by that Chapter.

Clause 169, as amended, ordered to stand part of the Bill.

Clause 183 ordered to stand part of the Bill.

Schedule 17

Video-sharing platform services: transitional provision etc

Amendments made: 94, in schedule 17, page 235, line 43, leave out paragraph (c).

This amendment is consequential on Amendment 6 (removal of clause 12).

Amendment 95, in schedule 17, page 236, line 27, at end insert—

“(da) the duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service) (terms of service);”.—(Paul Scully.)

This amendment ensures that services already regulated under Part 4B of the Communications Act 2003 (video-sharing platform services) are not required to comply with the new duties imposed by NC3 and NC4 during the transitional period.

Question proposed, That the schedule, as amended, be the Seventeenth schedule to the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour welcomes schedule 17, which the Government introduced on Report. We see this schedule as clarifying exactly how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework. The schedule is fundamentally important for both providers and users, as it establishes the formal requirements for these platforms as we move those requirements into this new legislation.

We welcome the clarification in paragraph 1(1) of the definition of a qualifying video-sharing service. On that point, I would be grateful if the Minister clarified the situation around livestreaming video platforms and whether this schedule would also apply to them. Throughout this Bill Committee, we have heard just how dangerous and harmful live video-sharing platforms can be, so this is an important point to clarify.

I have spoken at length about the importance of capturing the harms on these platforms, particularly in the context of child sexual exploitation being livestreamed online, which, thanks to the brilliant work of International Justice Mission, we know is a significant and widespread issue. I must make reference to the IJM’s findings from its recent White Paper, which highlighted the extent of the issue in the Philippines, which is widely recognised as a source country for livestreamed sexual exploitation of children. It found that traffickers often use cheap Android smartphones with pre-paid cellular data services to communicate with customers and produce and distribute explicit material. To reach the largest possible customer base, they often connect with sexually motivated offenders through everyday technology—the same platforms that the rest of us use to communicate with friends, family and co-workers.

One key issue in assessing the extent of online sexual exploitation of children is that we are entirely dependent on the detection of the crime, but the reality is that most current technologies that are widely used to detect various forms of online sexual exploitation of children are not designed to recognise livestreaming video services. This is an important and prolific issue, so I hope the Minister can assure me that the provisions in the schedule will apply to those platforms too.

Paul Scully Portrait Paul Scully
- Hansard - -

We are setting out in schedule 17 how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to these providers as they transition to the online safety framework. My understanding is that it does include livestreaming, but I will obviously write to the hon. Lady if I have got that wrong. I am not sure there is a significant legal effect here. To protect children and treat services fairly while avoiding unnecessary burdens on business, we are maintaining the current user protections in the VSP regime while the online safety framework is being implemented. That approach to transition avoids the duplication of regulation.

Question put and agreed to.

Schedule 17, as amended, accordingly agreed to.

Clause 203

Interpretation: general

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move amendment 105, in clause 203, page 167, line 8, after “including” insert “but not limited to”.

This amendment makes clear that the definition provided for content is not exhaustive.

I am delighted that we have a new Minister, because I can make exactly the same speech as I made previously in Committee—don’t worry, I won’t—and he will not know.

I still have concerns about the definition of “content”. I appreciate that the Government have tried to include a number of things in the definition. It currently states:

“‘content’ means anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description”.

That is pretty wide-ranging, but I do not think it takes everything into account. I know that it uses the word “including”; it does not say “only limited to” or anything like that. If there is to be a list of stuff, it should be exhaustive. That is my idea of how the Bill should be.

I have suggested in amendment 105 that we add “not limited to” after “including” in order to be absolutely clear that the content that we are talking about includes anything. It may or may not be on this list. Something that is missing from the list is VR technology. If someone is using VR or immersive technology and is a character on the screen, they can see what the character is doing and move their body around as that character, and whatever they do is user-generated content. It is not explicitly included in the Bill, even though there is a list of things. I do not even know how that would be written down in any way that would make sense.

I have suggested adding “not limited to” to make it absolutely clear that this is not an exhaustive list of the things that could be considered to be user-generated content or content for the purposes of the Bill. It could be absolutely anything that is user-generated. If the Minister is able to make it absolutely clear that this is not an exhaustive list and that “content” could be anything that is user-generated, I will not press the amendment to a vote. I would be happy enough with that commitment.

Paul Scully Portrait Paul Scully
- Hansard - -

Indeed I can give that commitment. This is an indicative list, not an exhaustive list, for the reasons that the hon. Lady set out. Earlier, we discussed the fact that technology moves on, and she has come up with an interesting example. It is important to note that adding unnecessary words in legislation could lead to unforeseen outcomes when it is interpreted by courts, which is why we have taken this approach, but we think it does achieve the same thing.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

On that basis, I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendment proposed: 58, in clause 203, page 167, leave out lines 26 to 31. —(Paul Scully.)

This amendment removes the definition of the “maximum summary term for either-way offences”, as that term has been replaced by references to the general limit in a magistrates’ court.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I would like to ask the Minister why this amendment has been tabled. I am not entirely clear. Could he give us some explanation of the intention behind the amendment? I am pretty sure it will be fine but, if he could just let us know what it is for, that would be helpful.

Paul Scully Portrait Paul Scully
- Hansard - -

I am happy to do so. Clause 203 sets out the interpretation of the terms used throughout the Bill. Amendment 58 removes a definition that is no longer required because the term is no longer in the Bill. It is as simple as that. The definition of relevant crime penalties under the Bill now uses a definition that has been updated in the light of changes to sentencing powers in magistrates’ courts set out in the Judicial Review and Courts Act 2022. The new definition of

“general limit in a magistrates’ court”

is now included in the Interpretation Act 1978, so no definition is required in this Bill.

Question put and agreed to.

Amendment 58 accordingly agreed to.

Amendment made: 59, in clause 203, page 168, line 48, at end insert—

“and references to restrictions on access to a service or to content are to be read accordingly.” —(Paul Scully.)

NC2 states what is meant by restricting users’ access to content, and this amendment makes it clear that the propositions in clause 203 about access read across to references about restricting access.

Question proposed, That the clause, as amended, stand part of the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Once again, I will abuse the privilege of having a different Minister at the Dispatch Box and mention the fact that, in the definitions, “oral communications” is mentioned in line 9, within the definition of “content” that we have already discussed. It is “oral communications” in this part of the Bill but “aural communications” in an earlier part of the Bill. I am still baffled as to why there is a difference. Perhaps we should have both included in both of these sections, or perhaps there should be some level of consistency throughout the Bill.

The “aural communications” section that I mentioned earlier in clause 50 is one of the parts that I am particularly concerned about, because it could create a loophole. That is a different spelling of the word. I asked this last time. I am not convinced that the answer I got gave me any more clarity than I had previously. I would be keen to understand why there is a difference, whether the difference is intentional and what the difference therefore is between “oral” and “aural” communications in terms of the Bill. My understanding is that oral communications are ones that are said and aural communications are ones that are heard. But, for the purposes of the Bill, those two things are really the same, unless user-generated content in which there is user-generated oral communication that no one can possibly hear is included. That surely does not fit into the definitions, because user-generated content is only considered if it is user-to-user—something that other people can see. Surely, oral communication would also be aural communication. In pretty much every instance that the Bill could possibly apply to, both definitions would mean the same thing. I understand the Minister may not have the answer to this at his fingertips, and I would be happy to hear from him later if that would suit him better.

Paul Scully Portrait Paul Scully
- Hansard - -

The clause provides legal certainty about the meaning of those terms as used in the Bill: things such as “content”, “encounter”, “taking down” and “terms of service”. That is what the clause is intended to do. It is intentional and is for the reasons the hon. Lady said. Oral means speech and speech only. Aural is speech and other sounds, which is what can be heard on voice calls. That includes music as well. One is speech. The other is the whole gamut.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

I am intrigued, because the hon. Member for Aberdeen North makes an interesting point. It is not one I have heard made before. Does the Minister think there is a distinction between oral and aural, where oral is live speech and aural is pre-recorded material that might be played back? Are those two considered distinct?

Paul Scully Portrait Paul Scully
- Hansard - -

My knowledge is being tested, so I will write to the hon. Member for Aberdeen North and make that available to the Committee. Coming back to the point about oral and aural that she made on Tuesday in relation to another clause on the exclusions, as I said, we have a narrow exemption to ensure that traditional phone calls are not subject to regulation. But that does mean that if a service such as Fortnite, which she spoke about previously, enables adults and children to have one-to-one oral calls, companies will still need to address the functionality surrounding how that happens, because enabling it might cause harm—for example, if an adult can contact an unknown child. That is still captured within the Bill.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Platforms will have to address, for example, the ways in which users can communicate with people who are not on their friends list. Things like that and other ways in which communication can be set up will have to be looked at in the risk assessment. With Discord, for instance, where two people can speak to each other, Discord will have to look at the way those people got into contact with each other and the risks associated with that, rather than the conversation itself, even though the conversation might be the only bit that involves illegality.

Paul Scully Portrait Paul Scully
- Hansard - -

It is the functionalities around it that enable the voice conversation to happen.

Question put and agreed to.

Clause 203, as amended, accordingly ordered to stand part of the Bill.

Clause 206

Extent

Question proposed, That the clause stand part of the Bill.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I would like to welcome the Government’s clarification, particularly as an MP representing a devolved nation within the UK. It is important to clarify the distinction between the jurisdictions, and I welcome that this clause does that.

Question put and agreed to.

Clause 206 accordingly ordered to stand part of the Bill.

Clause 207

Commencement and transitional provision

Amendment made: 60, in clause 207, page 173, line 15, leave out “to” and insert “and”.—(Paul Scully.)

This amendment is consequential on amendment 41 (removal of clause 55).

Question proposed, That the clause, as amended, stand part of the Bill.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Exactly. My hon. Friend makes an incredibly important point that goes to the heart of why we are here in the first place. If the platforms were not motivated by commercial interest and we could trust them to do the right thing on keeping children safe and reducing harm on their platforms, we would not require this legislation at all. But sadly, we are where we are, which is why it is even more imperative that we get on with the job, that Ofcom is given the tools to act swiftly and works to reduce the time before those tools come into effect, and that this legislation is enacted so that it actually makes a lasting difference.

Ofcom has already been responsible for regulating video-sharing platforms for two years, yet still, despite being in year 3, it is only asking websites to provide a plan as to how they will be compliant. The reality is that we can expect little on child protection before 2027-28, which creates a massive gap between public expectations and what will actually happen once the Bill is passed. We raised these concerns last time and felt little assurance from the Minister then in post, so I am wondering whether the current Minister can improve on his predecessor by ensuring a short timeline for when exactly the Bill can be implemented and Ofcom can act.

We all understand the need for the Bill, which my hon. Friend the Member for Warrington North just pointed out. That is why we have been supportive in Committee and throughout the passage of the Bill. But the measures that the Bill introduces must come into force as soon as is reasonably possible. Put simply, the industry is ready and users want to be protected online and are ready too. It is just the Government, sadly, and the regulator that would be potentially holding up implementation of the legislation.

The Minister has failed to concede on any of the issues that we have raised in Committee, despite being sympathetic and supportive. His predecessor was also incredibly supportive and sympathetic on everything we raised in Committee, yet failed to take into account a single amendment or issue that we raised. I therefore make a plea to this Minister at least to recognise the need to press matters and the timescale that is needed here. We have not sought to formally amend this clause, so I seek the Minister’s assurance that this legislation will be dealt with swiftly. I urge him to work with Labour, SNP colleagues and colleagues across the House to ensure that the legislation and the provisions in it are enacted and that there are no further unnecessary delays.

Paul Scully Portrait Paul Scully
- Hansard - -

Our intention is absolutely to get this regime operational as soon as possible after Royal Assent. We have to get to Royal Assent first, so I am looking forward to working with all parties in the other House to get the legislation to that point. After that, we have to ensure that the necessary preparations are completed effectively and that service providers understand exactly what is expected of them. To answer the point made by the hon. Member for Warrington North about service providers, the key difference from what happened in the years that led to this legislation being necessary is that they will now know exactly what is expected of them—and it is literally being expected of them, with legislation and with penalties coming down the line. They should not need to wait for the day one switch-on. They can be testing and working through things to ensure that the system does work on day one, but they can do that months earlier.

The legislation does require some activity that can be carried out only after Royal Assent, such as public consultation or laying of secondary legislation. The secondary legislation is important. We could have put more stuff in primary legislation, but that would belie the fact that we are trying to make this as flexible as possible, for the reasons that we have talked about. It is so that we do not have to keep coming back time and again for fear of this being out of date almost before we get to implementation in the first place.

However, we are doing things at the moment. Since November 2020, Ofcom has begun regulation of harmful content online through the video-sharing platform regulatory regime. In December 2020, the Government published interim codes of practice on terrorist content and activity and on child sexual exploitation and abuse online. Those will help to bridge the gap until the regulator becomes operational. In June 2021, we published “safety by design” guidance, and information on a one-stop shop for companies on protecting children online. In July 2021, we published the first Government online media literacy strategy. We encourage stakeholders, users and families to engage with and help to promote that wealth of material to minimise online harms and the threat of misinformation and disinformation. But clearly, we all want this measure to be on the statute book and implemented as soon as possible. We have talked a lot about child protection, and that is the core of what we are trying to do here.

Question put and agreed to.

Clause 207, as amended, accordingly ordered to stand part of the Bill.

New Clause 1

OFCOM’s guidance: content that is harmful to children and user empowerment

“(1) OFCOM must produce guidance for providers of Part 3 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be—

(a) primary priority content that is harmful to children, or

(b) priority content that is harmful to children.

(2) OFCOM must produce guidance for providers of Category 1 services which contains examples of content or kinds of content that OFCOM consider to be, or consider not to be, content to which section 14(2) applies (see section 14(8A)).

(3) Before producing any guidance under this section (including revised or replacement guidance), OFCOM must consult such persons as they consider appropriate.

(4) OFCOM must publish guidance under this section (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers in relation to the kinds of content that OFCOM consider to be content that is harmful to children and content relevant to the duty in clause 14(2) (user empowerment).

Brought up, and read the First time.

Paul Scully Portrait Paul Scully
- Hansard - -

I beg to move, That the clause be read a Second time.

The Government are committed to empowering adults to have greater control over their online experience, and to protecting children from seeing harmful content online. New clause 1 places a new duty on Ofcom to produce and publish guidance for providers of user-to-user regulated services, in relation to the crucial aims of empowering adults and providers having effective systems and processes in place. The guidance will provide further clarity, including through

“examples of content or kinds of content that OFCOM consider to be…primary priority”

or

“priority content that is harmful to children.”

Ofcom will also have to produce guidance that sets out examples of content that it considers to be relevant to the user empowerment duties, as set out in amendment 15 to clause 14.

It is really important that expert opinion is considered in the development of this guidance, and the new clause places a duty on Ofcom to consult with relevant persons when producing sets of guidance. That will ensure that the views of subject matter experts are reflected appropriately.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour is pleased to see the introduction of the new clause, which clarifies the role of Ofcom in delivering guidance to providers about their duties. Specifically, the new clause will require Ofcom to give guidance to providers on the kind of content that Ofcom considers to be harmful to children, or relevant to the user empowerment duty in clause 14. That is a very welcome addition indeed.

Labour remains concerned about exactly how these so-called user empowerment tools will work in practice—we have discussed that at length—and let us face it: we have had little assurance from the Minister on that point. We welcome the new clause, as it clarifies what guidance providers can expect to receive from Ofcom once the Bill is finally enacted. We can all recognise that Ofcom has a colossal task ahead of it—the Minister said so himself—so it is particularly welcome that the guidance will be subject to consultation with those that it deems appropriate. I can hope only that that will include the experts, and the many groups that provided expertise, support and guidance on internet regulation long before the Bill even received its First Reading, a long time ago. There are far too many of those experts and groups to list, but it is fundamental that the experts who often spot online harms before they properly emerge be consulted and included in this process if we are to truly capture the priority harms to children, as the new clause intends.

We also welcome the clarification in subsection (2) that Ofcom will be required to provide “examples of content” that would be considered to be—or not be—harmful. These examples will be key to ensuring that the platforms have nowhere to hide when it comes to deciding what is harmful; there will be no grey area. Ofcom will have the power to show them exact examples of what could be deemed harmful.

We recognise, however, that there is subjectivity to the work that Ofcom will have to do once the Bill passes. On priority content, it is most important that providers are clear about what is and is not acceptable; that is why we welcome the new clause, but we do of course wish that the Government applied the same logic to harm pertaining to adults online.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I am also happy to support new clause 1, but I have a couple of questions. It mentions that “replacement guidance” may be provided, which is important because, as we have said a number of times, things will change, and we will end up with a different online experience; that can happen quickly. I am glad that Ofcom has the ability to refresh and update the guidance.

My question is about timelines. There do not seem to be any timelines in the new clause for when the guidance is required to be published. It is key that the guidance be published before companies and organisations have to comply with it. My preference would be for it to be published as early as possible. There may well need to be more work, and updated versions of the guidance may therefore need to be published, but I would rather companies had an idea of the direction of travel, and what they must comply with, as soon as possible, knowing that it might be tweaked. That would be better than waiting until the guidance was absolutely perfect and definitely the final version, but releasing it just before people had to start complying with it. I would like an assurance that Ofcom will make publishing the guidance a priority, so that there is enough time to ensure compliance. We want the Bill to work; it will not work if people do not know what they have to comply with. Assurance on that would be helpful.

Paul Scully Portrait Paul Scully
- Hansard - -

I absolutely give that assurance to the hon. Lady; that is important. We all want the measures to be implemented, and the guidance to be out there, as soon as possible. Just now I talked about the platforms bringing in measures as soon as possible, without waiting for the implementation period. They can do that far better if they have the guidance. We are already working with Ofcom to ensure that the implementation period is as short as possible, and we will continue to do so.

Question put and agreed to.

New clause 1 accordingly read a Second time, and added to the Bill.

New Clause 2

Restricting users’ access to content

“(1) This section applies for the purposes of this Part.

(2) References to restricting users’ access to content, and related references, include any case where a provider takes or uses a measure which has the effect that—

(a) a user is unable to access content without taking a prior step (whether or not taking that step might result in access being denied), or

(b) content is temporarily hidden from a user.

(3) But such references do not include any case where—

(a) the effect mentioned in subsection (2) results from the use or application by a user of features, functionalities or settings which a provider includes in a service in compliance with the duty set out in section 14(2) (user empowerment), or

(b) access to content is controlled by another user, rather than the provider.

(4) See also section 203(5).”—(Paul Scully.)

This new clause deals with the meaning of references to restricting users’ access to content, in particular by excluding restrictions resulting from the use of user empowerment tools as described in clause 14.

Brought up, read the First and Second time, and added to the Bill.

New Clause 3

Duty not to act against users except in accordance with terms of service

“(1) A provider of a Category 1 service must operate the service using proportionate systems and processes designed to ensure that the provider does not—

(a) take down regulated user-generated content from the service,

(b) restrict users’ access to regulated user-generated content, or

(c) suspend or ban users from using the service,

except in accordance with the terms of service.

(2) Nothing in subsection (1) is to be read as preventing a provider from taking down content from a service or restricting users’ access to it, or suspending or banning a user, if such an action is taken—

(a) to comply with the duties set out in—

(i) section 9(2) or (3) (protecting individuals from illegal content), or

(ii) section 11(2) or (3) (protecting children from content that is harmful to children), or

(b) to avoid criminal or civil liability on the part of the provider that might reasonably be expected to arise if such an action were not taken.

(3) In addition, nothing in subsection (1) is to be read as preventing a provider from—

(a) taking down content from a service or restricting users’ access to it on the basis that a user has committed an offence in generating, uploading or sharing it on the service, or

(b) suspending or banning a user on the basis that—

(i) the user has committed an offence in generating, uploading or sharing content on the service, or

(ii) the user is responsible for, or has facilitated, the presence or attempted placement of a fraudulent advertisement on the service.

(4) The duty set out in subsection (1) does not apply in relation to—

(a) consumer content (see section (Interpretation of this Chapter));

(b) terms of service which deal with the treatment of consumer content.

(5) If a person is the provider of more than one Category 1 service, the duty set out in subsection (1) applies in relation to each such service.

(6) The duty set out in subsection (1) extends only to the design, operation and use of a service in the United Kingdom, and references in this section to users are to United Kingdom users of a service.

(7) In this section—

‘criminal or civil liability’ includes such a liability under the law of a country outside the United Kingdom;

‘fraudulent advertisement’ has the meaning given by section 35;

‘offence’ includes an offence under the law of a country outside the United Kingdom.

(8) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

This new clause imposes a duty on providers of Category 1 services to ensure that they do not take down content or restrict users’ access to it, or suspend or ban users, except in accordance with the terms of service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 4

Further duties about terms of service

All services

“(1) A provider of a regulated user-to-user service must include clear and accessible provisions in the terms of service informing users about their right to bring a claim for breach of contract if—

(a) regulated user-generated content which they generate, upload or share is taken down, or access to it is restricted, in breach of the terms of service, or

(b) they are suspended or banned from using the service in breach of the terms of service.

Category 1 services

(2) The duties set out in subsections (3) to (7) apply in relation to a Category 1 service, and references in subsections (3) to (9) to ‘provider’ and ‘service’ are to be read accordingly.

(3) A provider must operate a service using proportionate systems and processes designed to ensure that—

(a) if the terms of service state that the provider will take down a particular kind of regulated user-generated content from the service, the provider does take down such content;

(b) if the terms of service state that the provider will restrict users’ access to a particular kind of regulated user-generated content in a specified way, the provider does restrict users’ access to such content in that way;

(c) if the terms of service state cases in which the provider will suspend or ban a user from using the service, the provider does suspend or ban the user in those cases.

(4) A provider must ensure that—

(a) terms of service which make provision about the provider taking down regulated user-generated content from the service or restricting users’ access to such content, or suspending or banning a user from using the service, are—

(i) clear and accessible, and

(ii) written in sufficient detail to enable users to be reasonably certain whether the provider would be justified in taking the specified action in a particular case, and

(b) those terms of service are applied consistently.

(5) A provider must operate a service using systems and processes that allow users and affected persons to easily report—

(a) content which they consider to be relevant content (see section (Interpretation of this Chapter));

(b) a user who they consider should be suspended or banned from using the service in accordance with the terms of service.

(6) A provider must operate a complaints procedure in relation to a service that—

(a) allows for complaints of a kind mentioned in subsection (8) to be made,

(b) provides for appropriate action to be taken by the provider of the service in response to complaints of those kinds, and

(c) is easy to access, easy to use (including by children) and transparent.

(7) A provider must include in the terms of service provisions which are easily accessible (including to children) specifying the policies and processes that govern the handling and resolution of complaints of a kind mentioned in subsection (8).

(8) The kinds of complaints referred to in subsections (6) and (7) are—

(a) complaints by users and affected persons about content present on a service which they consider to be relevant content;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in any of subsections (1) or (3) to (5);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is relevant content;

(d) complaints by users who have been suspended or banned from using a service.

(9) The duties set out in subsections (3) and (4) do not apply in relation to terms of service which—

(a) make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children), or

(b) deal with the treatment of consumer content.

Further provision

(10) If a person is the provider of more than one regulated user-to-user service or Category 1 service, the duties set out in this section apply in relation to each such service.

(11) The duties set out in this section extend only to the design, operation and use of a service in the United Kingdom, and references to users are to United Kingdom users of a service.

(12) See also section 16 (duties to protect news publisher content).”—(Paul Scully.)

Subsections (3) to (8) of this new clause impose new duties on providers of Category 1 services in relation to terms of service that allow a provider to take down content or restrict users’ access to it, or to suspend or ban users. Such terms of service must be clear and applied consistently. Subsection (1) of the clause contains a duty which, in part, was previously in clause 20 of the Bill.

Brought up, read the First and Second time, and added to the Bill.

New Clause 5

OFCOM’s guidance about duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)

“(1) OFCOM must produce guidance for providers of Category 1 services to assist them in complying with their duties set out in sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service)(3) to (7).

(2) OFCOM must publish the guidance (and any revised or replacement guidance).”—(Paul Scully.)

This new clause requires OFCOM to give guidance to providers about complying with the duties imposed by NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 6

Interpretation of this Chapter

“(1) This section applies for the purposes of this Chapter.

(2) “Regulated user-generated content” has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question.

(3) “Consumer content” means—

(a) regulated user-generated content that constitutes, or is directly connected with content that constitutes, an offer to sell goods or to supply services,

(b) regulated user-generated content that amounts to an offence under the Consumer Protection from Unfair Trading Regulations 2008 (S.I. 2008/1277) (construed in accordance with section 53: see subsections (3), (11) and (12) of that section), or

(c) any other regulated user-generated content in relation to which an enforcement authority has functions under those Regulations (see regulation 19 of those Regulations).

(4) References to restricting users’ access to content, and related references, are to be construed in accordance with sections (Restricting users’ access to content) and 203(5).

(5) Content of a particular kind is “relevant content” if—

(a) a term of service, other than a term of service mentioned in section (Further duties about terms of service)(9), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

References to relevant content are to content that is relevant content in relation to the service in question.

(6) “Affected person” means a person, other than a user of the service in question, who is in the United Kingdom and who is—

(a) the subject of the content,

(b) a member of a class or group of people with a certain characteristic targeted by the content,

(c) a parent of, or other adult with responsibility for, a child who is a user of the service or is the subject of the content, or

(d) an adult providing assistance in using the service to another adult who requires such assistance, where that other adult is a user of the service or is the subject of the content.

(7) In determining what is proportionate for the purposes of sections (Duty not to act against users except in accordance with terms of service) and (Further duties about terms of service), the size and capacity of the provider of a service is, in particular, relevant.

(8) For the meaning of “Category 1 service”, see section 83 (register of categories of services).”—(Paul Scully.)

This new clause gives the meaning of terms used in NC3 and NC4.

Brought up, read the First and Second time, and added to the Bill.

New Clause 7

List of emerging Category 1 services

“(1) As soon as reasonably practicable after the first regulations under paragraph 1(1) of Schedule 11 come into force (regulations specifying Category 1 threshold conditions), OFCOM must comply with subsections (2) and (3).

(2) OFCOM must assess each regulated user-to-user service which they consider is likely to meet each of the following conditions, to determine whether the service does, or does not, meet them—

(a) the first condition is that the number of United Kingdom users of the user-to-user part of the service is at least 75% of the figure specified in any of the Category 1 threshold conditions relating to number of users (calculating the number of users in accordance with the threshold condition in question);

(b) the second condition is that—

(i) at least one of the Category 1 threshold conditions relating to functionalities of the user-to-user part of the service is met, or

(ii) if the regulations under paragraph 1(1) of Schedule 11 specify that a Category 1 threshold condition relating to a functionality of the user-to-user part of the service must be met in combination with a Category 1 threshold condition relating to another characteristic of that part of the service or a factor relating to that part of the service (see paragraph 1(4) of Schedule 11), at least one of those combinations of conditions is met.

(3) OFCOM must prepare a list of regulated user-to-user services which meet the conditions in subsection (2).

(4) The list must contain the following details about a service included in it—

(a) the name of the service,

(b) a description of the service,

(c) the name of the provider of the service, and

(d) a description of the Category 1 threshold conditions by reference to which the conditions in subsection (2) are met.

(5) OFCOM must take appropriate steps to keep the list up to date, including by carrying out further assessments of regulated user-to-user services.

(6) OFCOM must publish the list when it is first prepared and each time it is revised.

(7) When assessing whether a service does, or does not, meet the conditions in subsection (2), OFCOM must take such steps as are reasonably practicable to obtain or generate information or evidence for the purposes of the assessment.

(8) An assessment for the purposes of this section may be included in an assessment under section 83 or 84 (as the case may be) or carried out separately.”—(Paul Scully.)

This new clause requires OFCOM to prepare and keep up to date a list of regulated user-to-user services that have 75% of the number of users of a Category 1 service, and at least one functionality of a Category 1 service or one required combination of a functionality and another characteristic or factor of a Category 1 service.

Brought up, read the First and Second time, and added to the Bill.

New Clause 8

Child user empowerment duties

“(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 58(1)).

(12) In this section references to features include references to functionalities and settings.”—(Kirsty Blackman.)

Brought up, and read the First time.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

I beg to move, That the clause be read a Second time.

That was some stretch of procedure, Dame Angela, but we got there in the end. This new clause is about child user empowerment duties. I am really pleased that the Government have user empowerment duties in the Bill—they are a good thing—but I am confused as to why they apply only to adult users, and why children do not deserve the same empowerment rights over what they access online.

In writing the new clause, I pretty much copied clause 14, before there were any amendments to it, and added a couple of extra bits: subsections (8) and (9). In subsection (8), I have included:

“A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.”

That would go a step further than the verification process and allow users to approve only people who are in their class at school, people with whom they are friends, or even certain people in their class at school, and to not have others on that list. I know that young people playing Fortnite—I have mentioned Fortnite a lot because people play it a lot—or Roblox are contacted by users whom they do not know, and there is no ability for young people to switch off some of the features while still being able to contact their friends. Users can either have no contact from anyone, or they can have a free-for-all. That is not the case for all platforms, but a chunk of them do not let users speak only to people on their friends list, or receive messages only from people on the list.

My proposed subsection (8) would ensure that children could have a “white list” of people who they believe are acceptable, and who they want to be contacted by, and could leave others off the list. That would help tackle not just online child exploitation, but the significant online bullying that teachers and children report. Children have spoken of the harms they experience as a result of people bullying them and causing trouble online; the perpetrators are mainly other children. Children would be able to remove such people from the list and so would not receive any content, messages or comments from those who make their lives more negative.

Subsection (9) is related to subsection (8); it would require a service to include

“features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.”

Adults looking to exploit children will use private messaging on platforms such as Instagram. Instagram has to know how old its users are, so anybody who is signed up to it will have had to provide it with their date of birth. It is completely reasonable for a child to say, “I want to filter out everything from an adult.” When we talk about children online, we are talking about anybody from zero to 18, which is a very wide age range. Some of those people will be working and paying bills, but will not have access to the empowerment features that adults have access to, because they have not yet reached that magical threshold. Some services may decide to give children access to user empowerment tools, but there is no requirement to. The only requirement in the Bill on user empowerment tools is for adults. That is not fair.

Children should have more control over the online environment. We know how many children feel sad as a result of their interactions online, and how many encounter content online that they wish they had never seen and cannot unsee. We should give them more power over that, and more power to say, “No, I don’t want to see that. I don’t want people I don’t know contacting me. I don’t want to get unsolicited messages. I don’t want somebody messaging me, pretending that they are my friend or that they go to another school, when they are in fact an adult, and I won’t realise until it is far too late.”

The Bill applies to people of all ages. All of us make pretty crappy decisions sometimes. That includes teenagers, but they also make great decisions. If there was a requirement for them to have these tools, they could choose to make their online experience better. I do not think this was an intentional oversight, or that the Government set out to disadvantage children when they wrote the adult user empowerment clauses. I think they thought that it would be really good to have those clauses in the Bill, in order to give users a measure of autonomy over their time and interactions online. However, they have failed to include the same thing for children. It is a gap.

I appreciate that there are child safety duties, and that there is a much higher bar for platforms that have child users, but children are allowed a level of autonomy; look at the UN convention on the rights of the child. We give children choices and flexibilities; we do not force them to do every single thing they do, all day every day. We recognise that children should be empowered to make decisions where they can.

I know the Government will not accept the provision—I am not an idiot. I have never moved a new clause in Committee that has been accepted, and I am pretty sure that it will not happen today. However, if the Government were to say that they would consider, or even look at the possibility of, adding child user empowerment duties to the Bill, the internet would be a more pleasant place for children. They are going to use it anyway; let us try to improve their online experience even more than the Bill does already.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

The hon. Member for Aberdeen North has outlined the case for the new clause eloquently and powerfully. She may not press it to a Division, if the Minister can give her assurances, but if she did, she would have the wholehearted support of the Opposition.

We see new clause 8 as complementing the child safety duties in the legislation. We fully welcome provisions that provide children with greater power and autonomy in choosing to avoid exposure to certain types of content. We have concerns about how the provisions would work in practice, but that issue has more to do with the Government’s triple-shield protections than the new clause.

The Opposition support new clause 8 because it aims to provide further protections, in addition to the child safety duties, to fully protect children from harmful content and to empower them. It would empower and enable them to filter out private messages from adults or non-verified users. We also welcome the measures in the new clause that require platforms and service providers to design accessible terms of service. That is absolutely vital to best protect children online, which is why we are all here, and what the legislation was designed for.

Paul Scully Portrait Paul Scully
- Hansard - -

The aim of the user empowerment duty is to give adults more control over certain categories of legal content that some users will welcome greater choice over. Those duties also give adult users greater control over who they interact with online, but these provisions are not appropriate for children. As the hon. Member for Aberdeen North acknowledged, there are already separate duties on services likely to be accessed by children, in scope of part 3, to undertake comprehensive risk assessments and to comply with safety duties to protect children from harm. That includes requirements to assess how specific functionalities may facilitate the spread of harmful content, as outlined in clause 10(6)(e), and to protect children from harmful content, including content that has been designated as priority harmful content, by putting in place age-appropriate protections.

As such, children will not need to be provided with tools to control any harmful content they see, as the platform will need to put in place age-appropriate protections. We do not want to give children an option to choose to see content that is harmful to them. The Bill also outlines in clause 11(4)(f) that, where it is proportionate to do so, service providers will be required to take measures in certain areas to meet the child safety duties. That includes functionalities allowing for control over content that is encountered. It would not be appropriate to require providers to offer children the option to verify their identity, due to the safeguarding and data protection risks that that would pose. Although we expect companies to use technologies such as age assurance to protect children on their service, such technologies would be used only to establish age, not identity.

The new clause would create provisions to enable children to filter out private messages from adults and users who are not on an approved list, but the Bill already contains provisions that address the risks of adults contacting children. There are also requirements on service providers to consider how their service could be used for grooming or child sexual exploitation and abuse, and to apply proportionate measures to mitigate those risks. The service providers already have to assess and mitigate the risks. They have to provide the risk assessment, and within it they could choose to mitigate risk by requiring services to prevent unknown users from contacting children.

For the reasons I have set out, the Bill already provides strong protections for children on services that they are likely to access. I am therefore not able to accept the new clause, and I hope that the hon. Member for Aberdeen North will withdraw it.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

That was one of the more disappointing responses from the Minister, I am afraid. I would appreciate it if he could write to me to explain which part of the Bill provides protection to children from private messaging. I would be interested to have another look at that, so it would be helpful if he could provide details.

We do not want children to choose to see unsafe stuff, but the Bill is not strong enough on stuff like private messaging or the ability of unsolicited users to contact children, because it relies on the providers noticing that in their risk assessment, and putting in place mitigations after recognising the problem. It relies on the providers being willing to act to keep children safe in a way that they have not yet done.

When I am assisting my children online, and making rules about how they behave online, the thing I worry most about is unsolicited contact: what people might say to them online, and what they might hear from adults online. I am happy enough for them to talk to their friends online—I think that is grand—but I worry about what adults will say to them online, whether by private messaging through text or voice messages, or when they are playing a game online with the ability for a group of people working as a team together to broadcast their voices to the others and say whatever they want to say.

Lastly, one issue we have seen on Roblox, which is marketed as a children’s platform, is people creating games within it—people creating sex dungeons within a child’s game, or having conversations with children and asking the child to have their character take off their clothes. Those things have happened on that platform, and I am concerned that there is not enough protection in place, particularly to address that unsolicited contact. Given the disappointing response from the Minister, I am keen to push this clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

Labour supports new clause 9, as liability is an issue that we have repeatedly raised throughout the passage of the Bill—most recently, on Report. As colleagues will be aware, the new clause would introduce criminal liabilities for directors who failed to comply with their duties. This would be an appropriate first step in ensuring a direct relationship between senior management of platforms and companies, and their responsibilities to protect children from significant harm. As we have heard, this measure would drive a more effective culture of awareness and accountability in relation to online safety at the top of and within the entire regulated firm. It would go some way towards ensuring that online safety was at the heart of the governance structures internally. The Bill must go further to actively promote cultural change and put online safety at the forefront of business models; it must ensure that these people are aware that it is about keeping people safe, and that safety must come before any profit. A robust corporate and senior management liability scheme is needed, and it needs to impose personal liability on directors when they put children at risk.

The Minister knows as well as I do that the benefits of doing so would be strong. We have only to turn to the coroner’s comments in the tragic case of Molly Russell’s death—which I know we are all mindful of as we debate this Bill—to fully understand the damaging impact of viewing harmful content online. I therefore urge the Minister to accept new clause 9, which we wholeheartedly support.

Paul Scully Portrait Paul Scully
- Hansard - -

The Government recognise that the intent behind the new clause is to create new criminal offences of non-compliance with selected duties. It would establish a framework for personal criminal offences punishable through fines or imprisonment. It would mean that providers committed a criminal offence if they did not comply with certain duties.

We all want this Bill to be effective. We want it to be on the statute book. It is a question of getting that fine balance right, so that we can properly hold companies to account for the safety of their users. The existing approach to enforcement and senior manager liability strikes the right balance between robust enforcement and deterrence, and ensuring that the UK remains an attractive place to do business. We are confident that the Bill as a whole will bring about the change necessary to ensure that users, especially younger users, are kept safe online.

This new clause tries to criminalise not complying with the Bill’s duties. Exactly what activity would be criminalised is not obvious from the new clause, so it could be difficult for individuals to foresee exactly what type of conduct would constitute an offence. That could lead to unintended consequences, with tech executives driving an over-zealous approach to content take-down for fear of imprisonment, and potentially removing large volumes of innocuous content and so affecting the ability for open debate to take place.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - - - Excerpts

Does the Minister not think that the freedom of speech stuff and the requirement to stick to terms of service that he has put in as safeguards for that are strong enough, then?

Paul Scully Portrait Paul Scully
- Hansard - -

I come back to this point: I think that if people were threatened with personal legal liability, that would stifle innovation and make them over-cautious in their approach. That would disturb the balance that we have tried to achieve in this iteration of the Bill. Trying to keep internet users, particularly children, safe has to be achieved alongside free speech and not at its expense.

Further, the threat of criminal prosecution for failing to comply with numerous duties runs a real risk of damaging the attractiveness of the UK as a place to start up and grow a digital business. I want internet users in the future to be able to access all the benefits of the internet safely, but we cannot achieve that if businesses avoid the UK because our enforcement regime is so far out of kilter with international comparators. Instead, the most effective way to ensure that services act to protect people online is through the existing framework and the civil enforcement options that are already provided for in the Bill, overseen by an expert regulator.

--- Later in debate ---
Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

I appreciate the Minister’s comments, but from what my hon. Friends the Members for Folkestone and Hythe, for Eastbourne, and for Redditch said this morning about TikTok—these sorts of images get to children within two and a half minutes—it seems that there is a cultural issue, which the hon. Member for Pontypridd mentioned. Including new clause 9 in the Bill would really ram home the message that we are taking this seriously, that the culture needs to change, and that we need to do all that we can. I hope that the Minister will speak to his colleagues in the Ministry of Justice to see what, if anything, can be done.

Paul Scully Portrait Paul Scully
- Hansard - -

I forgot to respond to my hon. Friend’s question about whether I would meet him. I will happily meet him.

Nick Fletcher Portrait Nick Fletcher
- Hansard - - - Excerpts

I appreciate that. We will come back to this issue on Report, but I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

Question proposed, That the Chair do report the Bill, as amended, to the House.

None Portrait The Chair
- Hansard -

It is usual at this juncture for there to be a few thanks and niceties, if people wish to give them.

Paul Scully Portrait Paul Scully
- Hansard - -

I apologise, Dame Angela; I did not realise that I had that formal role, but you are absolutely right.

None Portrait The Chair
- Hansard -

If the Minister does not want niceties, that is up to him.

Paul Scully Portrait Paul Scully
- Hansard - -

Dame Angela, you know that I love niceties. It is Christmas—the festive season! It is a little bit warmer today because we changed room, but we remember the coldness; it reminds us that it is Christmas.

I thank you, Dame Angela, and thank all the Clerks in the House for bringing this unusual recommittal to us all, and schooling us in the recommittal process. I thank Members from all parts of the House for the constructive way in which the Bill has been debated over the two days of recommittal. I also thank the Doorkeepers and my team, many of whom are on the Benches here or in the Public Gallery. They are watching and WhatsApping—ironically, using end-to-end encryption.

None Portrait The Chair
- Hansard -

I was just about to say that encryption would be involved.

Paul Scully Portrait Paul Scully
- Hansard - -

I look forward to continuing the debate on Report.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I thank you, too, Dame Angela. I echo the Minister’s sentiments, and thank all the Clerks, the Doorkeepers, the team, and all the stakeholders who have massively contributed, with very short turnarounds, to the scrutiny of this legislation. I have so appreciated all that assistance and expertise, which has helped me, as shadow Minister, to compile our comments on the Bill following the Government’s recommittal of it to Committee, which is an unusual step. Huge thanks to my colleagues who joined us today and in previous sittings, and to colleagues from across the House, and particularly from the SNP, a number of whose amendments we have supported. We look forward to scrutinising the Bill further when it comes back to the House in the new year.

Online Safety Bill

Paul Scully Excerpts
Siobhan Baillie Portrait Siobhan Baillie
- Hansard - - - Excerpts

The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts. I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I obviously need to take up the matter with the hon. Lady and think carefully about it as matters proceed through the other place.

We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill and I welcome the legislation and the openness to change it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.

Paul Scully Portrait The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Paul Scully)
- View Speech - Hansard - -

There is a lot to cover in the short time I have, but first let me thank Members for their contributions to the debate. We had great contributions from the hon. Member for Pontypridd (Alex Davies-Jones), my right hon. Friend the Member for Witham (Priti Patel) and the right hon. Member for Barking (Dame Margaret Hodge)—I have to put that right, having not mentioned her last time—as well as from my hon. Friend the Member for Gosport (Dame Caroline Dinenage); the hon. Member for Aberdeen North (Kirsty Blackman); the former Secretary of State, my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright); and the hon. Members for Plymouth, Sutton and Devonport (Luke Pollard), for Reading East (Matt Rodda) and for Leeds East (Richard Burgon).

I would happily meet the hon. Member for Plymouth, Sutton and Devonport to talk about incel content, as he requested, and the hon. Members for Reading East and for Leeds East to talk about Olly Stephens and Joe Nihill. Those are two really tragic examples and it was good to hear the tributes to them and their being mentioned in this place in respect of the changes in the legislation.

We had great contributions from my right hon. Friend the Member for South Northamptonshire (Dame Andrea Leadsom), the hon. Member for Strangford (Jim Shannon) and my hon. Friend the Member for Dover (Mrs Elphicke). I am glad that my hon. Friend the Member for Stone (Sir William Cash) gave a three-Weetabix speech—I will have to look in the Tea Room for the Weetabix he has been eating.

There were great contributions from my hon. Friends the Members for Penistone and Stocksbridge (Miriam Cates) and for Great Grimsby (Lia Nici), from my right hon. Friend the Member for Chelmsford (Vicky Ford) and from my hon. Friend the Member for Yeovil (Mr Fysh). The latter talked about doom-scrolling; I recommend that he speaks to my right hon. Friend the Member for South Holland and The Deepings (Sir John Hayes), whose quoting of G. K. Chesterton shows the advantages of reading books rather than scrolling through a phone. I also thank my hon. Friends the Members for Redditch (Rachel Maclean), for Watford (Dean Russell) and for Stroud (Siobhan Baillie).

I am also grateful for the contributions during the recommittal process. The changes made to the Bill during that process have strengthened the protections that it can offer.

We reviewed new clause 2 carefully, and I am sympathetic to its aims. We have demonstrated our commitment to strengthening protections for children elsewhere in the Bill by tabling a series of amendments at previous stages, and the Bill already includes provisions to make senior managers liable for failing to prevent a provider from committing an offence and for failing to comply with information notices. We are committed to ensuring that children are safe online, so we will work with those Members and others to bring to the other place an effective amendment that delivers our shared aims of holding people accountable for their actions in a way that is effective and targeted at child safety, while ensuring that the UK remains an attractive place for technology companies to invest and grow.

We need to take time to get this right. We intend to base our amendments on the Irish Online Safety and Media Regulation Act 2022, which, ironically, was largely based on our work here, and which introduces individual criminal liability for failure to comply with the notice to end contravention. In line with that approach, the final Government amendment, at the end of the ping-pong between the other place and this place, will be carefully designed to capture instances in which senior managers, or those purporting to act in that capacity, have consented or connived in ignoring enforceable requirements, risking serious harm to children. The criminal penalties, including imprisonment or fines, will be commensurate with those applying to similar offences. While the amendment will not affect those who have acted in good faith to comply in a proportionate way, it will give the Act additional teeth—as we have heard—to deliver the change that we all want, and ensure that people are held to account if they fail to protect children properly.

As was made clear by my right hon. Friend the Member for Witham, child protection and strong implementation are at the heart of the Bill. Its strongest protections are for children, and companies will be held accountable for their safety. I cannot guarantee the timings for which my right hon. Friend asked, but we will not dilute our commitment. We have already started to speak to companies in this sphere, and I will also continue to work with her and others.

Sajid Javid Portrait Sajid Javid (Bromsgrove) (Con)
- Hansard - - - Excerpts

My hon. Friend has rightly prioritised the protection of children. He will recall that throughout the debate, a number of Members have asked the Government to consider the amendment that will be tabled by Baroness Kidron, which will require coroners to have access to data in cases in which the tragic death of a child may be related to social media and other online activities. Is my hon. Friend able to give a commitment from the Dispatch Box that the Government will look favourably on that amendment?

Paul Scully Portrait Paul Scully
- Hansard - -

Coroners already have some powers in this area, but we are aware of instances raised by my right hon. Friend and others in which that has not been the case. We will happily work with Baroness Kidron, and others, and look favourably on changes where they are necessary.

Debbie Abrahams Portrait Debbie Abrahams
- Hansard - - - Excerpts

I entirely agree that our focus has been on protecting children, but is the Minister as concerned as I am about disinformation and misinformation, and about the societal impacts on our democracy, not just in this country but elsewhere? The hon. Member for Watford suggested a Committee that could monitor such impacts. Is that something the Minister will reconsider?

Paul Scully Portrait Paul Scully
- Hansard - -

For the purpose of future-proofing, we have tried to make the Bill as flexible and as technologically neutral as possible so that it can adapt to changes. I think we will need to review it, and indeed I am sure that, as technology changes, we will come back with new legislation in the future to ensure that we continue to be world-beating—but let us see where we end up with that.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

May I follow up my hon. Friend’s response to our right hon. Friend the Member for Bromsgrove (Sajid Javid)? If it is the case that coroners cannot access data and information that they need in order to go about their duties—which was the frustrating element in the Molly Russell case—will the Government be prepared to close that loophole in the House of Lords?

Paul Scully Portrait Paul Scully
- Hansard - -

We will certainly work with others to address that, and if there is a loophole, we will seek to act, because we want to ensure—

Paul Scully Portrait Paul Scully
- Hansard - -

I will not give way for the moment. Oh, actually I will.

Priti Patel Portrait Priti Patel
- Hansard - - - Excerpts

I am grateful to the Minister for giving way. He was commenting on my earlier remarks about new clause 2 and the specifics around a timetable. I completely recognise that much of this work is under development. In my remarks, I asked for a timetable on engagement with the tech firms as well as transparency to this House on the progress being made on developing the regulations around criminal liability. It is important that this House sees that, and that we follow every single stage of that process.

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my right hon. Friend for that intervention. We want to have as many conversations as possible in this area with Members on all sides, and I hope we can be as transparent as possible in that operation. We have already started the conversation. The Secretary of State and I met some of the big tech companies just yesterday to talk about exactly this area.

My hon. Friend the Member for Dover, my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead (Mrs May) and others are absolutely right to highlight concerns about illegal small boat crossings and the harm that can be caused to people crossing in dangerous situations. The use of highly dangerous methods to enter this country, including unseaworthy, small or overcrowded boats and refrigerated lorries, presents a huge challenge to us all. Like other forms of serious and organised crime, organised immigration crime endangers lives, has a corrosive effect on society, puts pressure on border security resources and diverts money from our economy.

As the Prime Minister has said, stopping these crossings is one of the Government’s top priorities for the next year. The situation needs to be resolved and we will not hesitate to take action wherever that can have the most effect, including through this Bill. Organised crime groups continue to facilitate most migrant journeys to the UK and have no respect for human life, exploiting vulnerable migrants, treating them as commodities and knowingly putting people in life-threatening situations. Organised crime gangs are increasingly using social media to facilitate migrant crossings and we need to do more to prevent and disrupt the crimes facilitated through these platforms. We need to share best practice, improve our detection methods and take steps to close illegal crossing routes as the behaviour and methods of organised crime groups evolve.

However, amendment 82 risks having unforeseen consequences for the Bill. It could bring into question the meaning of the term “content” elsewhere in the Bill, with unpredictable implications for how the courts and companies would interpret it. Following constructive discussions with my hon. Friend the Member for Dover and my right hon. Friend the Member for Maidenhead, I can now confirm that in order to better tackle illegal immigration encouraged by organised gangs, the Government will add section 2 of the Modern Slavery Act 2015 to the list of priority offences. Section 2 makes it an offence to arrange or facilitate the travel of another person, including through recruitment, with a view to their exploitation.

We will also add section 24 of the Immigration Act to the priority offences list in schedule 7. Although the offences in section 2 cannot be carried out online, paragraph 33 of the schedule states that priority illegal content includes the inchoate offences relating to the offences listed. Therefore, aiding, abetting, counselling and conspiring in those offences by posting videos of people crossing the channel that show the activity in a positive light could be an offence that is committed online and therefore fall within what is priority illegal content. The result of this amendment would therefore be that platforms would have to proactively remove that content. I am grateful to my hon. Friend the Member for Dover and my right hon. Friends the Members for South Holland and The Deepings and for Maidenhead for raising this important issue, and I would be happy to offer them a meeting with my officials to discuss the drafting of this amendment ahead of it being tabled in the other place.

We recognise the strength of feeling on the issue of harmful conversion practices and remain committed to protecting people from these practices and making sure that they can live their lives free from the threat of harm or abuse. We have had constructive engagement with my hon. Friend the Member for Rutland and Melton (Alicia Kearns) on her amendment 84, which seeks to prevent children from seeing harmful online content on conversion practices. It is right that this issue is tackled through a dedicated and tailored legislative approach, which is why we are announcing today that the Government will publish a draft Bill to set out a proposed approach to banning conversion practices. This will apply to England and Wales. The Bill will protect everybody, including those targeted on the basis of their sexuality or being transgender. The Government will publish the Bill shortly and will ask for pre-legislative scrutiny by a Joint Committee in this parliamentary Session.

This is a complex area, and pre-legislative scrutiny exists to help ensure that any Bill introduced to Parliament does not cause unintended consequences. It will also ensure that the Bill benefits from stakeholder expertise and input from parliamentarians. The legislation must not, through a lack of clarity, harm the growing number of children and young adults experiencing gender-related distress by inadvertently criminalising or chilling legitimate conversations that parents or clinicians may have with children. This is an important issue, and it needs the targeted and robust approach that a dedicated Bill would provide.

Crispin Blunt Portrait Crispin Blunt (Reigate) (Con)
- Hansard - - - Excerpts

Will the Minister give way?

Paul Scully Portrait Paul Scully
- Hansard - -

I am afraid I have only three minutes, so I am not able to give way.

The Government cannot accept the Labour amendments that would re-add the adult safety duties and the concept of content that is harmful to adults. These duties and the definition of harmful content were removed from the Bill in Committee to protect free speech and to ensure that the Bill does not incentivise tech companies to censor legal content. It is not appropriate for the Government to decide whether legal content is harmful to adult users, and then to require companies to risk assess and set terms for such content. Many stakeholders and parliamentarians are justifiably concerned about the consequences of doing so, and I share those concerns. However, the Government recognise the importance of giving users the tools and information they need to keep themselves safe online, which is why we have introduced to the Bill a fairer, simpler approach for adults—the triple shield.

Members have talked a little about user empowerment. I will not have time to cover all of that, but the Government believe we have struck the right balance of empowering adult users on the content they see and engage with online while upholding the right to free expression. For those reasons, I am not able to accept these amendments, and I hope the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson) will not press them to a vote.

The Government amendments are consequential on removing the “legal but harmful” sections, which were debated extensively in Committee.

The Government recognise the concern of my hon. Friend the Member for Stroud about anonymous online abuse, and I applaud her important campaigning in this area. We expect Ofcom to recommend effective tools for compliance, with the requirement that these tools can be applied by users who wish to filter out non-verified users. I agree that the issue covered by amendment 52 is important, and I am happy to continue working with her to deliver her objectives in this area.

My right hon. Friend the Member for Chelmsford spoke powerfully, and we take the issue incredibly seriously. We are committed to introducing a new communications offence of intentional encouragement and assistance of self-harm, which will apply whether the victim is a child or an adult.

Vicky Ford Portrait Vicky Ford
- Hansard - - - Excerpts

Will my hon. Friend give way?

Paul Scully Portrait Paul Scully
- Hansard - -

I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

I beg to ask leave to withdraw the clause.

Clause, by leave, withdrawn.

Online Safety Bill

Paul Scully Excerpts
Consideration of Lords amendments
Tuesday 12th September 2023

Commons Chamber
Roger Gale Portrait Mr Deputy Speaker (Sir Roger Gale)
- Hansard - - - Excerpts

With this it will be convenient to discuss the following:

Lords amendment 349, and Government amendments (a) and (b).

Lords amendment 391, Government amendment (a), and Government consequential amendment (a).

Lords amendment 17, Government motion to disagree, and Government amendments (a) and (b) in lieu.

Amendment (i) to Government amendment (a) in lieu of Lords amendment 17.

Lords amendment 20, and Government motion to disagree.

Lords amendment 22, and Government motion to disagree.

Lords amendment 81, Government motion to disagree, and Government amendments (a) to (c) in lieu.

Lords amendment 148, Government motion to disagree, and Government amendment (a) in lieu.

Lords amendment 1, and amendments (a) and (b).

Lords amendments 2 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181 and 183 to 188.

Lords amendment 189, and amendment (a) in lieu.

Lords amendments 190 to 216.

Lords amendment 217, and amendment (a).

Lords amendments 218 to 227.

Lords amendment 228, and amendment (a).

Lords amendments 229 and 230.

Lords amendment 231, and amendment (a).

Lords amendments 232 to 319.

Lords amendment 320, and amendment (a).

Lords amendment 321, and amendment (a).

Lords amendments 322 to 348, 350 to 390 and 392 to 424.

Paul Scully Portrait Paul Scully
- Hansard - -

As we know from proceedings in this place, the Online Safety Bill is incredibly important. I am delighted that it is returning to the Commons in great shape, having gone through extensive and thorough scrutiny in the Lords. The Bill is world-leading, and the legislative framework established by it will lead to the creation of a profoundly safer online environment in this country. It will kickstart change where that is sorely needed, and ensure that our children are better protected against pornography and other content that is harmful to them. The Bill will also guard children against perpetrators of abhorrent child sexual exploitation and abuse, and ensure that tech companies take responsibility for tackling such content on their platforms, or be held criminally accountable.

William Cash Portrait Sir William Cash (Stone) (Con)
- Hansard - - - Excerpts

As I am sure my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) will agree, may I say how much we appreciate what the Government have done in relation to the matter just referred to? As the Minister knows, we withdrew our amendment in the House of Commons after discussion, and we had amazingly constructive discussions with the Government right the way through, and also in the House of Lords. I shall refer to that if I am called to speak later, but I simply wanted to put on record our thanks, because this will save so many children’s lives.

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my hon. Friend and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for all their work on this. I hope that this debate will show that we have listened and tried to work with everybody, including on this important part of the Bill. We have not been able to capture absolutely everything that everybody wants, but we are all determined to ensure that the Bill gets on the statute book as quickly as possible, to ensure that we start the important work of implementing it.

We have amended the Bill to bolster its provisions. A number of topics have been of particular interest in the other place. Following engagement with colleagues on those issues, we have bolstered the Bill’s protections for children, including a significant package of changes relating to age assurance. We have also enhanced protections for adult users.

Sajid Javid Portrait Sajid Javid (Bromsgrove) (Con)
- Hansard - - - Excerpts

My hon. Friend will know that Ministers and officials in his Department have worked extensively—I thank them for that—with me, Baroness Kidron, and the Bereaved Families for Online Safety group, on the amendment that will make it easier for coroners to have access to data from online companies in the tragic cases where that might be a cause of a child’s death. He will also know that there will still be gaps in legislation, but such gaps could be closed by further measures in the Data Protection and Digital Information Bill. His ministerial colleague in the other place has committed the Government to that, so may I invite my hon. Friend to set out more about the Government’s plans for doing just that?

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my right hon. Friend for his work on this, and Baroness Kidron for her work. I will cover that in more detail in a moment, but we remain committed to exploring measures that would facilitate better access to data for coroners under specific circumstances. We are looking for the best vehicle to do that, which includes those possibilities in the Data Protection and Digital Information Bill. We want to ensure that the protections for adult users afford people greater control over their online experience.

--- Later in debate ---
John Hayes Portrait Sir John Hayes (South Holland and The Deepings) (Con)
- Hansard - - - Excerpts

The Minister is setting out a powerful case for how the Government have listened to the overtures in this place and the other place. Further to the interventions from my hon. Friend the Member for Stone (Sir William Cash) and my right hon. Friend the Member for Bromsgrove (Sajid Javid), the former Culture Secretary, will the Minister be clear that the risk here is under-regulation, not over-regulation? Although the internet may be widely used by perfectly good people, the people who run internet companies are anything but daft and more likely to be dastardly.

Paul Scully Portrait Paul Scully
- Hansard - -

This is a difficult path to tread in approaching this issue for the first time. In many ways, these are things that we should have done 10 or 15 years ago, as social media platforms and people’s engagement with them proliferated over that period. Regulation has to be done gently, but it must be done. We must act now and get it right, to ensure that we hold the big technology companies in particular to account, while also understanding the massive benefits that those technology companies and their products provide.

Debbie Abrahams Portrait Debbie Abrahams (Oldham East and Saddleworth) (Lab)
- Hansard - - - Excerpts

I agree with the Minister that this is a groundbreaking Bill, but we must be clear that there are still gaps. Given what he is saying about the requirements for regulation of online social media companies and other platforms, how will he monitor, over a period of time, whether the measures that we have are as dynamic as they need to be to catch up with social media as it develops?

--- Later in debate ---
Paul Scully Portrait Paul Scully
- Hansard - -

The hon. Lady asks an important question, and that is the essence of what we are doing. We have tried to make this Bill flexible and proportionate. It is not technology specific, so that it is as future-proofed as possible. We must obviously lean into Ofcom as it seeks to operationalise the Act once the Bill gains Royal Assent. Ofcom will come back with its reporting, so not only will Government and the Department be a check on this, but Parliament will be able to assess the efficacy of the Bill as the system beds in and as technology and the various platforms move on and develop.

I talked about the offences, and I will just finalise my point about criminal liability. Those offences will be punishable with up to two years in prison.

John Penrose Portrait John Penrose (Weston-super-Mare) (Con)
- Hansard - - - Excerpts

Further to that point about the remaining gaps in the Bill, I appreciate what the Minister says about this area being a moving target. Everybody—not just in this country, but around the world—is having to learn as the internet evolves.

I thank the Minister for Government amendment 241, which deals with provenance and understanding where information posted on the web comes from, and allows people therefore to check whether they want to see it, if it comes from dubious sources. That is an example of a collective harm—of people posting disinformation and misinformation online and attempting to subvert our democratic processes, among other things. I park with him, if I may, the notion that we will have to come back to that area in particular. It is an area where the Bill is particularly weak, notwithstanding all the good stuff it does elsewhere, notably on the areas he has mentioned. I hope that everyone in this House accepts that that area will need to be revisited in due course.

Paul Scully Portrait Paul Scully
- Hansard - -

Undoubtedly we will have to come back to that point. Not everything needs to be in the Bill at this point. We have industry initiatives, such as Adobe’s content security policy, which are good initiatives in themselves, but as we better understand misinformation, disinformation, deepfakes and the proliferation and repetition of fake images, fake text and fake news, we will need to keep ensuring we can stay ahead of the game, as my hon. Friend said. That is why we have made the legislation flexible.

Baroness Hodge of Barking Portrait Dame Margaret Hodge (Barking) (Lab)
- Hansard - - - Excerpts

I have two things to ask. First, will the Minister spell out more clearly how Parliament will be able to monitor the implementation? What mechanisms do we have to do that? Secondly, on director liability, which I warmly welcome—I am pleased that the Government have listened to Back Benchers on this issue—does he not agree that the example we have set in the Bill should be copied in other Bills, such as the Economic Crime and Corporate Transparency Bill, where a similar proposal exists from Back Benchers across the House?

Paul Scully Portrait Paul Scully
- Hansard - -

The right hon. Lady raises some interesting points. We have conversed about harms, so I totally get her point about making sure that we tackle this issue in Parliament and be accountable in Parliament. As I have said, that will be done predominantly by monitoring the Bill through Ofcom’s reporting on what harms it is having to deal with. We have regular engagement with Ofcom, not only here and through the Select Committees, but through the Secretary of State.

On criminal liability, we conversed about that and made sure that we had a liability attached to something specific, rather than the general approach proposed at the beginning. It therefore means that we are not chilling innovation. People can understand, as they set up their approaches and systems, exactly what they are getting into in terms of risk for criminal liability, rather than having the general approach that was suggested at the beginning.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - - - Excerpts

The review mechanism strikes me as one of the places where the Bill falls down and is weakest, because there is not a dedicated review mechanism. We have needed this legislation for more than 30 years, and we have now got to the point of legislating. Does the Minister understand why I have no faith that future legislation will happen in a timely fashion, when it has taken us so long even to get to this point? Can he give us some reassurance that a proper review will take place, rather than just having Ofcom reports that may or may not be read?

Paul Scully Portrait Paul Scully
- Hansard - -

I have talked about the fact that we have to keep this legislation under review, because the landscape is fast-moving. At every stage that I have been dealing with this Bill, I have said that inevitably we will have to come back. We can make the Bill as flexible, proportionate and tech-unspecific as we can, but things are moving quickly. With all our work on AI, for example—the AI summit, the work of the Global Partnership on Artificial Intelligence, the international response, the Hiroshima accord and all the other areas that my hon. Friend the Member for Weston-super-Mare (John Penrose) spoke about earlier—we will have to come back, review it and look at whether the legislation remains world-beating. It is not just about the findings of Ofcom as it reports back to us.

I need to make a bit of progress, because I hope to have time to sum up a little bit at the end. We have listened to concerns about ensuring that the Bill provides the most robust protections for children from pornography and on the use of age assurance mechanisms. We are now explicitly requiring relevant providers to use highly effective age verification or age estimation to protect children from pornography and other primary priority content that is harmful to children. The Bill will also ensure a clear privacy-preserving and future-proofed framework governing the use of age assurance, which will be overseen by Ofcom.

There has been coverage in the media about how the Bill relates to encryption, which has often not been accurate. I take the opportunity to set the record straight. Our stance on challenging sexual abuse online remains the same. Last week in the other place, my noble Friend Lord Parkinson, the Parliamentary Under-Secretary of State for Arts and Heritage, shared recent data from UK police forces that showed that 6,350 offences related to sexual communication with a child were recorded last year alone. Shockingly, 5,500 of those offences took place against primary school-age children. Those appalling statistics illustrate the urgent need for change. The Government are committed to taking action against the perpetrators and stamping out these horrific crimes. The information that social media companies currently give to UK law enforcement contributes to more than 800 arrests or voluntary attendances of suspected child sexual offenders on average every month. That results in an estimated 1,200 children being safeguarded from child sexual abuse.

There is no intention by the Government to weaken the encryption technology used by platforms. As a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, Ofcom will have the power to direct companies to make best efforts to develop or source technology to identify and remove illegal child sexual abuse content. We know that this technology can be developed. Before it can be required by Ofcom, such technology must meet minimum standards of accuracy. If appropriate technology does not exist that meets these requirements, Ofcom cannot require its use. That is why the powers include the ability for Ofcom to require companies to make best endeavours to develop or source a new solution.

Damian Collins Portrait Damian Collins (Folkestone and Hythe) (Con)
- Hansard - - - Excerpts

Does my hon. Friend agree that the companies already say in their terms of service that they do not allow illegal use of their products, yet they do not say how they will monitor whether there is illegal use and what enforcement they take? What the Bill gives us, for the first time, is the right for Ofcom to know the answers to those questions and to know whether the companies are even enforcing their own terms of service.

Paul Scully Portrait Paul Scully
- Hansard - -

My hon. Friend makes an important point, and I thank him for the amazing work he has done in getting the Bill to this point and for his ongoing help and support in making sure that we get it absolutely right. This is not about bashing technology companies; it is about not only holding them to account, but bringing them closer, to make sure that we can work together on these issues to protect the children I was talking about.

Despite the breadth of existing safeguards, we recognise the concerns expressed about privacy and technical feasibility in relation to Ofcom’s power to issue CSE or terrorism notices. That is why we introduced additional safeguards in the Lords. First, Ofcom will be required to obtain a skilled person’s report before issuing any warning notice and exercising its powers under clause 122. Ofcom must also provide a summary of the report to the relevant provider when issuing a warning notice. We are confident that in addition to Ofcom’s existing routes of evidence gathering, this measure will help to provide the regulator with the necessary information to determine whether to issue a notice and the requirements that may be put in place.

We also brought forth amendments requiring Ofcom to consider the impact that the use of technology would have on the availability of journalistic content and the confidentiality of journalistic sources when considering whether to issue a notice. That builds on the existing safeguards in clause 133 regarding freedom of expression and privacy.

We recognise the disproportionate levels of harm that women and girls continue to face online, and that is why the Government have made a number of changes to the Bill to strengthen protections for women and girls. First, the Bill will require Ofcom to produce guidance on online harms that disproportionately affect women and girls and to provide examples of best practice to providers, and it will require providers to bring together in one clear place all the measures that they take to tackle online abuse against women and girls on their platforms. The Bill will also require Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner, in addition to the Children’s Commissioner, while preparing codes of practice. That change to the Bill will ensure that the voices of victims of abuse are brought into the consultation period.

--- Later in debate ---
Baroness Hodge of Barking Portrait Dame Margaret Hodge
- Hansard - - - Excerpts

I am grateful for the amendment, which I think is important. Will the Minister make it clear that he will not accept the amendments tabled by the hon. Member for Yeovil (Mr Fysh)?

Paul Scully Portrait Paul Scully
- Hansard - -

Indeed, we will not be accepting those amendments, but I will cover more of that later on, after I have listened to the comments that I know my hon. Friend wants to make.

As a result of the amendment, we have also made a small change to clause 98—the emerging category 1 services list—to ensure that it makes operational sense. Prior to Baroness Morgan’s amendment, a service had to meet the functionality threshold for content and 75% of the user number threshold to be on the emerging services list. Under the amended clause, there is now a plausible scenario where a service could meet the category 1 threshold without meeting any condition based on user numbers, so we had to make the change to ensure that the clause worked in that scenario.

We have always been clear that the design of a service, its functionalities and its other features are key drivers of risk that impact on the risk of harm to children. Baroness Kidron’s amendments 17, 20, 22 and 81 seek to treat those aspects as sources of harm in and of themselves. Although we agree with the objective, we are concerned that they do not work within the legislative framework and risk legal confusion and delaying the Bill. We have worked closely with Baroness Kidron and other parliamentarians to identify alternative ways to make the role that design and functionalities play more explicit. I am grateful to colleagues in both Houses for being so generous with their time on this issue. In particular, I thank again my right hon. and learned Friend the Member for Kenilworth and Southam for his tireless work, which was crucial in enabling the creation of an alternative and mutually satisfactory package of amendments. We will disagree to Lords amendments 17, 20, 22 and 81 and replace them with amendments that make it explicit that providers are required to assess the impact that service design, functionalities and other features have on the risk of harm to children.

On Report, my hon. Friend the Member for Crawley (Henry Smith) raised animal abuse on the internet and asked how we might address such harmful content. I am pleased that the changes we have since made to the Bill fully demonstrate the Government’s commitment to tackling criminal activity relating to animal torture online. It is a cause that Baroness Merron has championed passionately. Her amendment in the other place sought to require the Secretary of State to review certain offences and, depending on the review’s outcome, to list them as priority offences in schedule 7. To accelerate measures to tackle such content, the Government will remove clause 63—the review clause—and instead immediately list section 4(1) of the Animal Welfare Act 2006 as a priority offence. Officials at the Department for Environment, Food and Rural Affairs have worked closely with the Royal Society for the Prevention of Cruelty to Animals and are confident that the offence of unnecessary suffering will capture a broad swathe of behaviour. I hope the whole House will recognise our efforts and those of Baroness Merron and support the amendment.

You will be pleased to know, Mr Deputy Speaker, that I will conclude my remarks. I express my gratitude to my esteemed colleagues both here and in the other place for their continued and dedicated engagement with this complicated, complex Bill during the course of its parliamentary passage. I strongly believe that the Bill, in this form, strikes the right balance in providing the strongest possible protections for both adults and children online while protecting freedom of expression. The Government have listened carefully to the views of Members on both sides of the House, stakeholders and members of the public. The amendments we have made during the Bill’s progress through the Lords have further enhanced its robust and world-leading legislative framework. It is groundbreaking and will ensure the safety of generations to come. I ask Members of the House gathered here to support the Government’s position on the issues that I have spoken about today.

Roger Gale Portrait Mr Deputy Speaker (Sir Roger Gale)
- Hansard - - - Excerpts

I call the Opposition spokesperson.

--- Later in debate ---
Anna Firth Portrait Anna Firth (Southend West) (Con)
- Hansard - - - Excerpts

I want to speak briefly about Lords amendments 195 and 153, which would allow Ofcom, coroners and bereaved parents to acquire information and support relating to a child’s use of social media in the event of that child’s tragic death. Specifically, I want to speak about Archie Battersbee, who lived in my constituency but lost his life tragically last year, aged only 12. Archie’s mum, Hollie, was in the Public Gallery at the beginning of the debate, and I hope that she is still present. Hollie found Archie unconscious on the stairs with a ligature around his neck. The brain injury Archie suffered put him into a four-month coma from which, sadly, doctors were unable to save him.

To this day, Hollie believes that Archie may have been taking part in some form of highly dangerous online challenge, but, unable to access Archie’s online data beyond 90 days of his search history, she has been unable to put this devastating question to rest. Like the parents of Molly, Breck, Isaac, Frankie and Sophia, for the last year Hollie has been engaged in a cruel uphill struggle against faceless corporations in her attempt to determine whether her child’s engagement with a digital service contributed to his death. Despite knowing that Archie viewed seven minutes of content and received online messages in the hour and a half prior to his death, she has no way of knowing what may have been said or exactly what he may have viewed, and the question of his online engagement and its potential role in his death remains unanswered.

Lords amendment 195, which will bolster Ofcom’s information-gathering powers, will I hope require a much more humane response from providers in such tragic cases as this. This is vital and much-needed legislation. Had it been in place a year ago, it is highly likely that Hollie could have laid her concerns to rest and perhaps received a pocket of peace in what has been the most traumatic time any parent could possibly imagine.

I also welcome Lords amendment 153, which will mandate the largest providers to put in place a dedicated helpline so that parents who suffer these tragic events will have a direct line and a better way of communicating with social media providers, but the proof of the pudding will obviously be in the eating. I very much hope that social media providers will man that helpline with real people who have the appropriate experience to deal with parents at that tragic time in their lives. I believe that Hollie and the parents of many other children in similar tragic cases will welcome the Government’s amendments that allow Ofcom, coroners and bereaved parents to access their children’s online data via the coroner directing Ofcom.

I pay tribute to the noble Baroness Kidron, to my right hon. Friend the Member for Bromsgrove (Sajid Javid) and to the Bereaved Families for Online Safety group, who have done so much fantastic work in sharing their heartrending stories and opening our eyes to what has been necessary to improve the Online Safety Bill. I also, of course, pay tribute to Ian Russell, to Hollie and to all the other bereaved parents for their dedication to raising awareness of this hugely important issue.

If I could just say one last thing, I have been slipped from the Education Committee to attend this debate today and I would like to give an advert for the Committee’s new inquiry, which was launched on Monday, into the effects of screen time on education and wellbeing. This Bill is not the end of the matter—in many ways it is just the beginning—and I urge all Members please to engage with this incredibly important inquiry by the Education Committee.

Paul Scully Portrait Paul Scully
- Hansard - -

I thank all right hon. and hon. Members for their contribution to the debate today and, indeed, right through the passage of this complex Bill.

First, let me turn to the amendments tabled by my hon. Friend the Member for Yeovil (Mr Fysh). I understand that the intention of his amendments is to restrict the reach of the new online safety regulatory regime in a number of ways. I appreciate his concern to avoid unnecessarily burdening business, and I am sympathetic to his point that the Bill should not inhibit sectors such as the life sciences sector. I reassure him that such sectors are not the target of this regime and that the new regulatory framework is proportionate, risk-based and pro-innovation.

The framework has been designed to capture a range of services where there is a risk of significant harm to users, and the built-in exemptions and categorisations will ensure it is properly targeted. The alternative would be a narrow scope, which would be more likely to inadvertently exempt risky services or to displace harm on to services that are out of scope. The extensive discussion on this point in both Houses has made it clear that such a position is unlikely to be acceptable.

The amendments to the overarching statement that would change the services in scope would introduce unclear and subjective terms, causing issues of interpretation. The Bill is designed so that low-risk services will have to put in place only proportionate measures that reflect the risk of harm to their users and the service provider’s size and capacity, ensuring that small providers will not be overly burdened unless the level of risk requires it.

The amendment that would ensure Ofcom cannot require the use of a proactive technology that introduces weaknesses or vulnerabilities into a provider’s systems duplicates existing safeguards. It also introduces vague terms that could restrict Ofcom’s ability to require platforms to use the most effective measures to address abhorrent illegal activity.

Ofcom must act proportionately, and it must consider whether a less intrusive measure could achieve the same effect before requiring the use of proactive technology. Ofcom also has duties to protect both privacy and private property, including algorithms and code, under the Human Rights Act 1998.

Ian Paisley Portrait Ian Paisley
- Hansard - - - Excerpts

I thank the Minister for engaging with us on access to private property and for setting up, with his officials, a consultation on the right to access a person’s phone after they are deceased or incapacitated. I thank him for incorporating some of those thoughts in what he and the Government are doing today. I hope this is the start of something and that these big digital companies will no longer be able to bully people. The boot will be on the other foot, and the public will own what they have on their digital devices.

Paul Scully Portrait Paul Scully
- Hansard - -

The hon. Gentleman is talking about the access of coroners, families and others to information, following the sad death of Molly Russell. Again, I pay tribute to Ian Russell and all the campaigners. I am glad that we have been able to find an answer to a very complex situation, not only because of its international nature but because of data protection, et cetera.

The measures I have outlined will ensure that risks relating to security vulnerabilities are managed. The Bill is also clear that Ofcom cannot require companies to use proactive technology on privately communicated content, in order to comply with their safety duties, which will provide further safeguards for user privacy and data security.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

Will the Minister make it clear that we should expect the companies to use proactive technology, because they already use it to make money by recommending content to people, which is a principal reason for the Bill? If they use proactive technology to make money, they should also use it to keep people safe.

Paul Scully Portrait Paul Scully
- Hansard - -

My hon. Friend absolutely nails it. He said earlier that businesses are already collecting this data. Since I was first involved with the Bill, it has primarily been about getting businesses to adhere to their own terms and conditions. The data they use should be used in that way.

The amendment to the definition of “freedom of expression” in part 12 would have no effect as these concepts are already covered by the existing definition. Changing the definition of “automated tool” would introduce untested terms and would have an unclear and confusing impact on the duties.

My hon. Friend the Member for Yeovil also asked for clarification of how Ofcom’s power to view information remotely will be used, and whether the power is sufficiently safeguarded. I assure the House that this power is subject to strict safeguards that mean it cannot be used to undermine a provider’s systems.

On Third Reading in the other place, the Government introduced amendments that defined the regulator’s power to view information remotely, whereas previously the Bill spoke of access. As such, there are no risks to system security, as the power does not enable Ofcom to access the service. Ofcom also has a duty to act proportionately and must abide by its privacy obligations under the Human Rights Act. Ofcom has a stringent restriction on disclosing businesses’ commercially sensitive and other information without consent.

My hon. Friend also asked for clarification on whether Ofcom will be able to view live user data when using this power. Generally, Ofcom would expect to require a service to use a test dataset. However, there may be circumstances where Ofcom asks a service to execute a test using data that it holds, for example, in testing how content moderation systems respond to certain types of content on a service as part of an assessment of the systems and processes. In that scenario, Ofcom may need to use a provider’s own test dataset containing content that has previously violated its own terms of service. However, that would be subject to Ofcom’s privacy obligations and data protection law.

Lords amendment 17 seeks to explicitly exempt low-risk functionality from aspects of user-to-user services’ children’s risk assessment duties. I am happy to reassure my hon. Friend that the current drafting of the Government’s amendment in lieu of Lords amendment 17 places proportionate requirements on providers. It explicitly excludes low-risk functionality from the more stringent duty to identify and assess the impact that higher-risk functionalities have on the level of risk of harm to children. Proportionality is further baked into this duty through Ofcom’s risk assessment guidance. Ofcom is bound by the principle of proportionality as part of its general duties under the Communications Act 2003, as updated by the Bill. As such, it would not be able to recommend that providers should identify and assess low-risk functionality.

The amendment to Lords amendment 217 tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis) would introduce a new safeguard that requires Ofcom to consider whether technology required under a clause 122 notice would circumvent end-to-end encryption. I wish to reassure him and others who have raised the question that the amendment is unnecessary because it is duplicative of existing measures that restrict Ofcom’s use of its powers. Under the Bill’s safeguards, Ofcom cannot require platforms to weaken or remove encryption, and must already consider the risk that specified technology can result in a breach of any statutory provision or the rule of law concerning privacy. We have intentionally designed the Bill so that it is technology neutral and futureproofed, so we cannot accept amendments that risk the legislation quickly becoming out of date. That is why we focused on safeguards that uphold user rights and ensure measures that are proportionate to the specific risks, rather than focusing on specific features such as encryption. For the reasons I have set out, I cannot accept the amendment and hope it will not be pressed to a vote.

The amendment tabled by my hon. Friend the Member for Stroud (Siobhan Baillie) would create an additional reporting requirement on Ofcom to review, as part of its report on the use of age assurance, whether the visibility of a user’s verification status improves the effectiveness of age assurance, but that duplicates existing review requirements in the Bill. The Bill already provides for a review of user verification; under clause 179, the Secretary of State will be required to review the operation of the online safety regulatory framework as a whole. This review must assess how effective the regulatory framework is at minimising the risk of harm that in-scope services pose to users in the UK. That may include a review of the effectiveness of the current user verification and non-verified users duty.

I thank my hon. Friend also for raising the issue of user verification and the visibility of verification status. I am pleased to confirm that Ofcom will have the power to set out guidance on user verification status being visible to all users. With regard to online fraud or other illegal activity, mandatory user verification and visibility of verification status is something Ofcom could recommend and require under the illegal content safety duties.

Let me quickly cover some of the other points raised in the debate. I thank my hon. Friend the Member for Gosport (Dame Caroline Dinenage), a former Minister, for all her work. She talked about young people, and the Bill contains many measures, for example on self-harm and suicide content, that reflect those concerns and will help to protect them.

On the comments made by the hon. Member for Aberdeen North (Kirsty Blackman) and indeed the shadow Minister, the hon. Member for Pontypridd (Alex Davies-Jones), whom I am glad to see back in her place, there are a number of review points. Clause 179 requires the Secretary of State to review how the Bill is working in practice, and there will be a report resulting from that, which will be laid before Parliament. We also have the annual Ofcom report that I talked about, and most statutory instruments in the Bill will be subject to the affirmative procedure. The Bill refers to a review after two to five years—Ministers can dictate when it takes place within that period—but that is based on allowing a long enough time for the Bill to bed in and be implemented. It is important that we have the ability to look at that in Parliament.

The principles of the UN convention on the rights of the child are already reflected in the Bill. Although the Bill does not cite the convention by name, its principles are all covered.

My hon. Friend the Member for Folkestone and Hythe (Damian Collins) did an amazing job in his time in my role, and before and afterwards as Chair of the Joint Committee responsible for the pre-legislative scrutiny of the Online Safety Bill. When he talked about scrutiny, I had the advantage of seeing the wry smile of the officials in the Box behind him. That scrutiny has been going on since 2021. Sarah Connolly, one of our amazing team of officials, has been involved with the Bill since it was just a concept.

Damian Collins Portrait Damian Collins
- Hansard - - - Excerpts

As Carnegie UK Trust observed online, a child born on the day the Government first published their original internet safety strategy would now be in its second year of primary school.

Paul Scully Portrait Paul Scully
- Hansard - -

I do not think I need to respond to that, but it goes to show, does it not?

My hon. Friend talked about post-legislative scrutiny. Now that we have the new Department for Science, Innovation and Technology, we have extra capacity within Committees to look at various aspects, and not just online safety, as important as that is. It also gives us the ability to have sub-Committees. Clearly, we want to make sure that this and all the decisions that we make are scrutinised well. We are always open to looking at what is happening. My hon. Friend talked about Ofcom being able to appoint skilled persons for research—I totally agree and he absolutely made the right point.

My right hon. Friend the Member for Basingstoke (Dame Maria Miller) and the hon. Member for Caithness, Sutherland and Easter Ross (Jamie Stone) talked about cyber-flashing. As I have said, that has come within the scope of the Bill, but we will also be implementing a broader package of offences that will cover the taking of intimate images without consent. To answer my right hon. Friend’s point, yes, we will still look further at that matter.

The hon. Member for Leeds East (Richard Burgon) talked about Joe Nihill. Will he please send my best wishes and thanks to Catherine and Melanie for their ongoing work in this area? It is always difficult, but it is admirable that people can turn a tragedy into such a positive cause. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) made two points with which I absolutely agree. They are very much covered in the Bill and in our thinking as well, so I say yes to both.

My right hon. Friend the Member for Chelmsford (Vicky Ford) and my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) talked about pornography. Clearly, we must build on the Online Safety Bill. We have the pornography review as well, which explores regulation, legislation and enforcement. We very much want to make sure that this is the first stage, but we will look at pornography and the enforcement around that in a deeper way over the next 12 months.

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

It has just crossed my mind that the Minister might be saying that he agreed with everything that I said, which cannot be right. Let me be clear about the two points. One was in relation to whether, when we look at design harms, both proportionality and balancing duties are relevant—I think that he is saying yes to both. The other point that I raised with him was around encryption, and whether I put it in the right way in terms of the Government’s position on encryption. If he cannot deal with that now, and I would understand if he cannot, will he write to me and set out whether that is the correct way to see it?

Paul Scully Portrait Paul Scully
- Hansard - -

I thank my right hon. Friend for that intervention. Indeed, end-to-end encrypted services are in the scope of the Bill. Companies must assess the level of risk and meet their duties no matter what their design is.

Vicky Ford Portrait Vicky Ford
- Hansard - - - Excerpts

Can the Minister confirm whether the letter I received from the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is accurate?

Paul Scully Portrait Paul Scully
- Hansard - -

I was just coming to that. I thank my right hon. Friend for the rest of her speech. She always speaks so powerfully on eating disorders—on anorexia in particular—and I can indeed confirm the intent behind the Minister’s letter about the creation and use of algorithms.

Finally, I shall cover two more points. My hon. Friend the Member for Stone (Sir William Cash) always speaks eloquently about this. He talked about Brexit, but I will not get into the politics of that. Suffice to say, it has allowed us—as in other areas of digital and technology—to be flexible and not prescriptive, as we have seen in measures that the EU has introduced.

I also ask my hon. Friend the Member for Southend West (Anna Firth) to pass on my thanks and best wishes to Hollie whom I met to talk about Archie Battersbee.

Alex Davies-Jones Portrait Alex Davies-Jones
- Hansard - - - Excerpts

On the small high-harm platforms that are now in the scope of the Bill, will the Minister join me in thanking Hope Not Hate, the Antisemitism Policy Trust and CST, which have campaigned heavily on this point? While we have been having this debate, the CST has exposed BitChute, one of those small high-harm platforms, for geoblocking some of the hate to comply with legislation but then advertising loopholes and ways to get around that on the platform. Can the Minister confirm that the regulator will be able to take action against such proceedings?

Paul Scully Portrait Paul Scully
- Hansard - -

I will certainly look at that. Our intention is that in all areas that might not fall within the user empowerment duties, especially those relating to children and their protection, we will look to make sure that the work of those organisations is reflected in what we are trying to achieve in the Bill.

We have talked about the various Ministers that have looked after the Bill during its passage, and the Secretary of State was left literally holding the baby in every sense of the word because she continued to work on it while she was on maternity leave. We can see the results of that with the engagement that we have had. I urge all Members on both sides of the House to consider carefully the amendments I have proposed today in lieu of those made in the Lords. I know every Member looks forward eagerly to a future in which parents have surety about the safety of their children online. That future is fast approaching.

I reiterate my thanks to esteemed colleagues who have engaged so passionately with the Bill. It is due to their collaborative spirit that I stand today with amendments that we believe are effective, proportionate and agreeable to all. I hope all Members will feel able to support our position.

Amendment (a) made to Lords amendment 182.

Lords amendment 182, as amended, agreed to.

Amendments (a) and (b) made to Lords amendment 349.

Lords amendment 349, as amended, agreed to.

Amendment (a) made to Lords amendment 391.

Lords amendment 391, as amended, agreed to.

Government consequential amendment (a) made.

Lords amendment 17 disagreed to.

Government amendments (a) and (b) made in lieu of Lords amendment 17.

Lords amendment 20 disagreed to.

Lords amendment 22 disagreed to.

Lords amendment 81 disagreed to.

Government amendments (a) to (c) made in lieu of Lords amendment 81.

Lords amendment 148 disagreed to.

Government amendment (a) made in lieu of Lords amendment 148.

Lords amendments 1 to 16, 18, 19, 21, 23 to 80, 82 to 147, 149 to 181, 183 to 348, 350 to 390, and 392 to 424 agreed to, with Commons financial privileges waived in respect of Lords amendments 171, 180, 181, 317, 390 and 400.

Ordered, That a Committee be appointed to draw up Reasons to be assigned to the Lords for disagreeing to their amendments 20 and 22;

That Paul Scully, Steve Double, Alexander Stafford, Paul Howell, Alex Davies-Jones, Taiwo Owatemi and Kirsty Blackman be members of the Committee;

That Paul Scully be the Chair of the Committee;

That three be the quorum of the Committee.

That the Committee do withdraw immediately.—(Mike Wood.)

Committee to withdraw immediately; reasons to be reported and communicated to the Lords.