Online Safety Bill Debate
Lindsay Hoyle (Speaker - Chorley)
Commons Chamber
Before I call the Minister to open the debate, I have something to say about the scope of today’s debate. This is day 2 of debate on consideration of the Bill as amended in the Public Bill Committee. We are debating today only the new clauses, amendments and new schedules listed on the selection paper that I have issued today.
Members may be aware that the Government have tabled a programme motion that would recommit certain clauses and schedules to a Public Bill Committee. There will be an opportunity to debate that motion following proceedings on consideration. The Government have also published a draft list of proposed amendments to the Bill that they intend to bring forward during the recommittal process. These amendments are not in scope for today. There will be an opportunity to debate, at a future Report stage, the recommitted clauses and schedules, as amended on recommittal in the Public Bill Committee.
Most of today’s amendments and new clauses do not relate to the clauses and schedules that are being recommitted. These amendments and new clauses have been highlighted on the selection paper. Today will be the final chance for the Commons to consider them: there will be no opportunity for them to be tabled and considered again at any point during the remaining Commons stages.
New Clause 11
Notices to deal with terrorism content or CSEA content (or both)
“(1) If OFCOM consider that it is necessary and proportionate to do so, they may give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service.
(2) A notice under subsection (1) that relates to a regulated user-to-user service is a notice requiring the provider of the service—
(a) to do any or all of the following—
(i) use accredited technology to identify terrorism content communicated publicly by means of the service and to swiftly take down that content;
(ii) use accredited technology to prevent individuals from encountering terrorism content communicated publicly by means of the service;
(iii) use accredited technology to identify CSEA content, whether communicated publicly or privately by means of the service, and to swiftly take down that content;
(iv) use accredited technology to prevent individuals from encountering CSEA content, whether communicated publicly or privately, by means of the service; or
(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service or part of the service, which—
(i) achieves the purpose mentioned in paragraph (a)(iii) or (iv), and
(ii) meets the standards published by the Secretary of State (see section 106(10)).
(3) A notice under subsection (1) that relates to a regulated search service is a notice requiring the provider of the service—
(a) to do either or both of the following—
(i) use accredited technology to identify search content of the service that is terrorism content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes terrorism content identified by the technology;
(ii) use accredited technology to identify search content of the service that is CSEA content and to swiftly take measures designed to secure, so far as possible, that search content of the service no longer includes CSEA content identified by the technology; or
(b) to use the provider’s best endeavours to develop or source technology for use on or in relation to the service which—
(i) achieves the purpose mentioned in paragraph (a)(ii), and
(ii) meets the standards published by the Secretary of State (see section 106(10)).
(4) A notice under subsection (1) that relates to a combined service is a notice requiring the provider of the service—
(a) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service, or to use best endeavours to develop or source technology as described in subsection (2)(b) for use on or in relation to that part of the service;
(b) to do either or both of the things described in subsection (3)(a) in relation to the search engine of the service, or to use best endeavours to develop or source technology as described in subsection (3)(b) for use on or in relation to the search engine of the service;
(c) to do any or all of the things described in subsection (2)(a) in relation to the user-to-user part of the service and either or both of the things described in subsection (3)(a) in relation to the search engine of the service; or
(d) to use best endeavours to develop or source—
(i) technology as described in subsection (2)(b) for use on or in relation to the user-to-user part of the service, and
(ii) technology as described in subsection (3)(b) for use on or in relation to the search engine of the service.
(5) For the purposes of subsections (2) and (3), a requirement to use accredited technology may be complied with by the use of the technology alone or by means of the technology together with the use of human moderators.
(6) See—
(a) section (Warning notices), which requires OFCOM to give a warning notice before giving a notice under subsection (1), and
(b) section 105 for provision about matters which OFCOM must consider before giving a notice under subsection (1).
(7) A notice under subsection (1) relating to terrorism content present on a service must identify the content, or parts of the service that include content, that OFCOM consider is communicated publicly on that service (see section 188).
(8) For the meaning of “accredited” technology, see section 106(9) and (10).”—(Julia Lopez.)
This clause replaces existing clause 104. The main changes are: for user-to-user services, a notice may require the use of accredited technology to prevent individuals from encountering terrorism or CSEA content; for user-to-user and search services, a notice may require a provider to use best endeavours to develop or source technology to deal with CSEA content.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 12—Warning notices.
Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.
Government new clause 40—Amendment of Enterprise Act 2002.
Government new clause 42—Former providers of regulated services.
Government new clause 43—Amendments of Part 4B of the Communications Act.
Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.
Government new clause 51—Publication by providers of details of enforcement action.
Government new clause 52—Exemptions from offence under section 152.
Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).
New clause 1—Provisional re-categorisation of a Part 3 service—
“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.
(2) If OFCOM—
(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and
(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,
New clause 16—Communication offence for encouraging or assisting self-harm—
“(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“D”) commits an offence if—
(a) D sends a message,
(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and
(c) D’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.
(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—
(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and
(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and
(c) the message was wholly motivated by compassion towards P or to promote the interests of P’s health or wellbeing.””
This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.
New clause 17—Liability of directors for compliance failure—
“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.
(2) If OFCOM considers that the failure results from any—
(a) action,
(b) direction,
(c) neglect, or
(d) with the consent
This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.
New clause 23—Financial support for victims support services—
“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.
(2) Those regulations must—
(a) specify criteria setting out which victim support services are eligible for financial support under this provision;
(b) set out a means by which the amount of funding available should be determined;
(c) make provision for the funding to be reviewed and allocated on a three year basis.
(3) Regulations under this section—
(a) shall be made by statutory instrument, and
(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”
New clause 28—Establishment of Advocacy Body—
“(1) There is to be a body corporate (“the Advocacy Body”) to represent interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) “enforceable requirements” relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.
(8) The Advocacy Body may undertake research on their own account.
(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.
(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”
New clause 29—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;
(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;
(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—
(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;
(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;
(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub paras (2)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);
(e) to promote better coordination within the media literacy sector.
(4) OFCOM may prepare such guidance about the matters referred to in subsection (2) as it considers appropriate.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 30—Media literacy strategy—
“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).
(2) The strategy must—
(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),
(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;
(c) explain why OFCOM considers that the steps it proposes to take will be effective;
(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.
(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.
(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.
(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—
(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;
(b) the advisory committee on disinformation and misinformation, and
(c) any other person that OFCOM consider appropriate.
(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—
(a) revise the strategy, or
(b) publish an explanation of why they have decided not to revise it.
(7) If OFCOM decides to revise the strategy they must—
(a) consult in accordance with subsection (3), and
(b) publish the revised strategy.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 31—Research conducted by regulated services—
“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.
(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—
(a) a specific piece of research held by the service, or
(b) all research the service holds on a topic specified by OFCOM.”
New clause 34—Factual Accuracy—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—
(a) produced user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content.
(3) The index under subsection (1) must—
(a) satisfy minimum quality criteria to be set by OFCOM, and
(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”
New clause 35—Duty of balance—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service which selects or prioritises particular—
(a) user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content
New clause 36—Identification of information incidents by OFCOM—
“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.
(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—
(a) identifying, and assessing the severity of, actual or potential information incidents; and
(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).
(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—
(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and
(b) publish such recommendations or other information that OFCOM considers appropriate.
(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.
(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—
(a) the matters it will take into account in determining whether an information incident has arisen;
(b) the matters it will take into account in determining the severity of an incident; and
(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.
(6) For the purposes of this section—
“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;
“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”
This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.
New clause 37—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—
(i) indicate the nature of content on a service (for example, show where it is an advertisement);
(ii) indicate the reliability and accuracy of the content; and
(iii) facilitate control over what content is received;
(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.
(4) OFCOM must prepare guidance about—
(a) the matters referred to in subsection (3) as it considers appropriate; and
(b) minimum standards that media literacy initiatives must meet.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 45—Sharing etc intimate photographs or film without consent—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—
(a) the photograph or film only shows activity that would be ordinarily seen on a public street, except for a photograph or film of breastfeeding;
(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(d) the photograph or film has been previously shared with consent in public;
(e) A reasonably believed that the photograph or film had been previously shared with consent in public;
(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;
(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.
(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.
(5) It is a defence for a person charged with an offence under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;
(c) reasonably believed that the sharing was necessary for the administration of justice;
(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; and
(e) reasonably believed that the sharing was in the public interest.
(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(7) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(8) “Photograph” includes the negative as well as the positive version.
(9) “Film” means a moving image.
(10) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(12) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”
This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.
New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where an intimate image is shared without consent for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 48—Threatening to share etc intimate photographs or film—
“(1) A person (A) commits an offence if—
(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and
(b) A intends B to fear that the threat will be carried out, or A is reckless as to whether B will fear that the threat will be carried out.
(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.
(3) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(5) References to sharing, or threatening to share, such a photograph or film with another person include—
(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;
(b) showing, or threatening to show, it to another person;
(c) placing, or threatening to place, it for another person to find; or
(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.
(6) “Photograph” includes the negative as well as the positive version.
(7) “Film” means a moving image.
(8) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(10) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, and where the sender intends to cause fear, or is reckless to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images—
“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.
(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
This new clause inserts intimate image abuse into legislation that qualifies victims for special measures when testifying in court (such as partitions to hide them from view, video testifying etc.) which is already prescribed by law.
New clause 50—Anonymity for victims of offences involving the sharing of intimate images—
“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.
(2) In subsection 1 after paragraph (db) insert—
(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
Similar to NC49, this new clause allows victims of intimate image abuse the same availability for anonymity as other sexual offences to protect their identities and give them the confidence to testify against their abuser without fear of repercussions.
New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements—
“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.
(2) The report must be laid before Parliament within six months of the passing of this Act.”
New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration—
“(1) A person (A) commits an offence if—
(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—
(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or
(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and
(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if—
(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;
(b) the sharing is by a refugee organisation registered in the UK and which falls within the scope of subsection (3) of section 25A of the Immigration Act 1971;
(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.
(4) It is a defence for a person charged under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime; and
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.
(5) A person who commits an offence under this section is liable on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”
This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.
Government amendments 234 and 102 to 117.
Amendment 195, in clause 104, page 87, line 10, leave out subsection (1) and insert—
“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—
(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service;
(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”
Amendment 152, page 87, line 18, leave out ‘whether’.
This amendment is consequential on Amendment 153.
Amendment 153, page 87, line 19, leave out ‘or privately’.
This amendment removes the ability to monitor encrypted communications.
Government amendment 118.
Amendment 204, in clause 105, page 89, line 17, at end insert—
“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”
This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.
Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.
Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).
Government amendment 175.
Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).
This amendment removes the bar of conditionality that must be met for super complaints that relate to a single regulated service.
Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.
Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—
(a) The Secretary of State, and
(b) such other persons as OFCOM considers appropriate.”
This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.
Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert
“90 day maximum time limits in relation to the determination and notification to the complainant of—”.
This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.
Amendment 26, in clause 146, page 123, line 33, leave out
“give OFCOM a direction requiring”
and insert “may make representations to”.
Amendment 27, page 123, line 36, leave out subsection (2) and insert—
“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”
Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert
“established under this section is to consist of the following members—”.
Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert
“established under this section must”.
Amendment 30, page 124, line 4, leave out subsection (5).
Amendment 32, page 124, line 4, leave out clause 148.
Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.
Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—
“(a) B has not consented for A to send or give the photograph or film to B, and”.
Government amendments 249 to 252, 228, 229 and 235 to 237.
Government new schedule 2—Amendments of Part 4B of the Communications Act.
Government new schedule 3—Video-sharing platform services: transitional provision etc.
Government amendment 238.
Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.
This amendment would give the power to make regulations under Schedule 11 to OFCOM.
Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.
Amendment 1, page 198, line 9, at end insert—
“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”
Amendment 159, page 198, line 9, at end insert—
“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”
This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.
Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.
Amendment 9, page 198, line 28, leave out “and” and insert “or”.
Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.
Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.
Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.
Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.
Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.
Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).
This amendment is consequential on Amendment 35.
Government amendments 230, 253 to 261 and 233.
I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.
I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.
The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s power and the criminal law reforms. I will take this opportunity also to briefly set out the further changes that the Government recently committed to making later in the Bill’s passage.
Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.
To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation; it is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.
Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.
Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.
The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.
Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.
Thank you, Mr Speaker; I will try to keep my remarks very much in scope.
The harmful communications offence in clause 151 was a reform to communication offences proposed in the Bill. Since the Bill has been made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and ensure that protections for free speech are robust.
I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.
Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.
It is an absolute pleasure to be back in the Chamber to respond on behalf of the Opposition to this incredibly important piece of legislation on its long overdue second day on Report. It certainly has not been an easy ride so far: I am sure that Bill Committee colleagues across the House agree that unpicking and making sense of this unnecessarily complicated Bill has been anything but straightforward.
We should all be incredibly grateful and are all indebted to the many individuals, charities, organisations and families who have worked so hard to bring online safety to the forefront for us all. Today is a particularly important day, as we are joined in the Public Gallery by a number of families who have lost children in connection with online harms. They include Lorin LaFave, Ian Russell, Andy and Judy Thomas, Amanda and Stuart Stephens and Ruth Moss. I sincerely hope that this debate will do justice to their incredible hard work and commitment in the most exceptionally difficult of circumstances.