I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
Government new clause 12—Warning notices.
Government new clause 20—OFCOM’s reports about news publisher content and journalistic content.
Government new clause 40—Amendment of Enterprise Act 2002.
Government new clause 42—Former providers of regulated services.
Government new clause 43—Amendments of Part 4B of the Communications Act.
Government new clause 44—Repeal of Part 4B of the Communications Act: transitional provision etc.
Government new clause 51—Publication by providers of details of enforcement action.
Government new clause 52—Exemptions from offence under section 152.
Government new clause 53—Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2).
New clause 1—Provisional re-categorisation of a Part 3 service—
“(1) This section applies in relation to OFCOM’s duty to maintain the register of categories of regulated user-to-user services and regulated search services under section 83.
(2) If OFCOM—
(a) consider that a Part 3 service not included in a particular part of the register is likely to meet the threshold conditions relevant to that part, and
(b) reasonably consider that urgent application of duties relevant to that part is necessary to avoid or mitigate significant harm,
New clause 16—Communication offence for encouraging or assisting self-harm—
“(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“D”) commits an offence if—
(a) D sends a message,
(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and
(c) D’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.
(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If D arranges for a person (“D2”) to do an act and D2 does that act, D is also to be treated as having done that act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—
(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and
(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and
(c) the message was wholly motivated by compassion towards P or to promote the interests of P’s health or wellbeing.””
This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.
New clause 17—Liability of directors for compliance failure—
“(1) This section applies where OFCOM considers that there are reasonable grounds for believing that a provider of a regulated service has failed, or is failing, to comply with any enforceable requirement (see section 112) that applies in relation to the service.
(2) If OFCOM considers that the failure results from any—
(a) action,
(b) direction,
(c) neglect, or
(d) with the consent
This new clause would enable Ofcom to exercise its enforcement powers under Chapter 6, Part 7 of the Bill against individual directors, managers and other officers at a regulated service provider where it considers the provider has failed, or is failing, to comply with any enforceable requirement.
New clause 23—Financial support for victims support services—
“(1) The Secretary of State must by regulations make provision for penalties paid under Chapter 6 to be used for funding for victims support services.
(2) Those regulations must—
(a) specify criteria setting out which victim support services are eligible for financial support under this provision;
(b) set out a means by which the amount of funding available should be determined;
(c) make provision for the funding to be reviewed and allocated on a three-year basis.
(3) Regulations under this section—
(a) shall be made by statutory instrument, and
(b) may not be made unless a draft has been laid before and approved by resolution of each House of Parliament.”
New clause 28—Establishment of Advocacy Body—
“(1) There is to be a body corporate (“the Advocacy Body”) to represent the interests of child users of regulated services.
(2) A “child user”—
(a) means any person aged 17 years or under who uses or is likely to use regulated internet services; and
(b) includes both any existing child user and any future child user.
(3) The work of the Advocacy Body may include—
(a) representing the interests of child users;
(b) the protection and promotion of these interests;
(c) any other matter connected with those interests.
(4) The “interests of child users” means the interests of children in relation to the discharge by any regulated company of its duties under this Act, including—
(a) safety duties about illegal content, in particular CSEA content;
(b) safety duties protecting children;
(c) “enforceable requirements” relating to children.
(5) The Advocacy Body must have particular regard to the interests of child users that display one or more protected characteristics within the meaning of the Equality Act 2010.
(6) The Advocacy Body will be defined as a statutory consultee for OFCOM’s regulatory decisions which impact upon the interests of children.
(7) The Advocacy Body must assess emerging threats to child users of regulated services and must bring information regarding these threats to OFCOM.
(8) The Advocacy Body may undertake research on its own account.
(9) The Secretary of State must either appoint an organisation known to represent children to be designated the functions under this Act, or create an organisation to carry out the designated functions.
(10) The budget of the Advocacy Body will be subject to annual approval by the board of OFCOM.
(11) The Secretary of State must give directions to OFCOM as to how it should recover the costs relating to the expenses of the Advocacy Body, or the Secretary of State in relation to the establishment of the Advocacy Body, through the provisions to require a provider of a regulated service to pay a fee (as set out in section 71).”
New clause 29—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to reach audiences who are less engaged with, and harder to reach through, traditional media literacy initiatives;
(b) to address gaps in the availability and accessibility of media literacy provisions targeted at vulnerable users;
(c) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(d) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by—
(i) carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public;
(ii) seeking to ensure, through the exercise of OFCOM’s online safety functions, that providers of regulated services take appropriate measures to improve users’ media literacy;
(iii) seeking to improve the evaluation of the effectiveness of the initiatives and measures mentioned in sub-paragraphs (3)(d)(i) and (ii) (including by increasing the availability and adequacy of data to make those evaluations);
(e) to promote better coordination within the media literacy sector.
(4) OFCOM may prepare such guidance about the matters referred to in subsection (3) as it considers appropriate.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) OFCOM must co-operate with the Secretary of State in the exercise and performance of their duty under this section.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 30—Media literacy strategy—
“(1) OFCOM must prepare a strategy which sets out how they intend to undertake their duty to promote media literacy in relation to regulated user-to-user services and regulated search services under section (Duty to promote media literacy: regulated user-to-user services and search services).
(2) The strategy must—
(a) set out the steps OFCOM propose to take to achieve the pursuit of the objectives set out in section (Duty to promote media literacy: regulated user-to-user services and search services),
(b) set out the organisations, or types of organisations, that OFCOM propose to work with in undertaking the duty;
(c) explain why OFCOM considers that the steps it proposes to take will be effective;
(d) explain how OFCOM will assess the extent of the progress that is being made under the strategy.
(3) In preparing the strategy OFCOM must have regard to the need to allocate adequate resources for implementing the strategy.
(4) OFCOM must publish the strategy within the period of 6 months beginning with the day on which this section comes into force.
(5) Before publishing the strategy (or publishing a revised strategy), OFCOM must consult—
(a) persons with experience in or knowledge of the formulation, implementation and evaluation of policies and programmes intended to improve media literacy;
(b) the advisory committee on disinformation and misinformation, and
(c) any other person that OFCOM consider appropriate.
(6) If OFCOM have not revised the strategy within the period of 3 years beginning with the day on which the strategy was last published, they must either—
(a) revise the strategy, or
(b) publish an explanation of why they have decided not to revise it.
(7) If OFCOM decides to revise the strategy they must—
(a) consult in accordance with subsection (5), and
(b) publish the revised strategy.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 31—Research conducted by regulated services—
“(1) OFCOM may, at any time it considers appropriate, produce a report into how regulated services commission, collate, publish and make use of research.
(2) For the purposes of the report, OFCOM may require services to submit to OFCOM—
(a) a specific piece of research held by the service, or
(b) all research the service holds on a topic specified by OFCOM.”
New clause 34—Factual Accuracy—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service must provide an index of the historic factual accuracy of material published by each user who has—
(a) produced user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content.
(3) The index under subsection (2) must—
(a) satisfy minimum quality criteria to be set by OFCOM, and
(b) be displayed in a way which allows any user easily to reach an informed view of the likely factual accuracy of the content at the same time as they encounter it.”
New clause 35—Duty of balance—
“(1) The purpose of this section is to reduce the risk of harm to users of regulated services caused by disinformation or misinformation.
(2) Any Regulated Service which selects or prioritises particular—
(a) user-generated content,
(b) news publisher content, or
(c) comments and reviews on provider content
New clause 36—Identification of information incidents by OFCOM—
“(1) OFCOM must maintain arrangements for identifying and understanding patterns in the presence and dissemination of harmful misinformation and disinformation on regulated services.
(2) Arrangements for the purposes of subsection (1) must in particular include arrangements for—
(a) identifying, and assessing the severity of, actual or potential information incidents; and
(b) consulting with persons with expertise in the identification, prevention and handling of disinformation and misinformation online (for the purposes of subsection (2)(a)).
(3) Where an actual or potential information incident is identified, OFCOM must as soon as reasonably practicable—
(a) set out any steps that OFCOM plans to take under its online safety functions in relation to that situation; and
(b) publish such recommendations or other information that OFCOM considers appropriate.
(4) Information under subsection (3) may be published in such a manner as appears to OFCOM to be appropriate for bringing it to the attention of the persons who, in OFCOM’s opinion, should be made aware of it.
(5) OFCOM must prepare and issue guidance about how it will exercise its functions under this section and, in particular—
(a) the matters it will take into account in determining whether an information incident has arisen;
(b) the matters it will take into account in determining the severity of an incident; and
(c) the types of responses that OFCOM thinks are likely to be appropriate when responding to an information incident.
(6) For the purposes of this section—
“harmful misinformation or disinformation” means misinformation or disinformation which, taking into account the manner and extent of its dissemination, may have a material adverse effect on users of regulated services or other members of the public;
“information incident” means a situation where it appears to OFCOM that there is a serious or systemic dissemination of harmful misinformation or disinformation relating to a particular event or situation.”
This new clause would insert a new clause into the Bill to give Ofcom a proactive role in identifying and responding to the sorts of information incidents that can occur in moments of crisis.
New clause 37—Duty to promote media literacy: regulated user-to-user services and search services—
“(1) In addition to the duty on OFCOM to promote media literacy under section 11 of the Communications Act 2003, OFCOM must take such steps as they consider appropriate to improve the media literacy of the public in relation to regulated user-to-user services and search services.
(2) This section applies only in relation to OFCOM’s duty to regulate—
(a) user-to-user services, and
(b) search services.
(3) OFCOM’s performance of its duty in subsection (1) must include pursuit of the following objectives—
(a) to encourage the development and use of technologies and systems in relation to user-to-user services and search services which help to improve the media literacy of members of the public, including in particular technologies and systems which—
(i) indicate the nature of content on a service (for example, show where it is an advertisement);
(ii) indicate the reliability and accuracy of the content; and
(iii) facilitate control over what content is received;
(b) to build the resilience of the public to disinformation and misinformation by using media literacy as a tool to reduce the harm from that misinformation and disinformation;
(c) to promote greater availability and effectiveness of media literacy initiatives and other measures, including by carrying out, commissioning or encouraging educational initiatives designed to improve the media literacy of the public.
(4) OFCOM must prepare guidance about—
(a) the matters referred to in subsection (3) as it considers appropriate; and
(b) minimum standards that media literacy initiatives must meet.
(5) Where OFCOM prepares guidance under subsection (4) it must—
(a) publish the guidance (and any revised or replacement guidance); and
(b) keep the guidance under review.
(6) Every report under paragraph 12 of the Schedule to the Office of Communications Act 2002 (OFCOM’s annual report) for a financial year must contain a summary of the steps that OFCOM have taken under subsection (1) in that year.”
This new clause places an additional duty on Ofcom to promote media literacy of the public in relation to regulated user-to-user services and search services.
New clause 45—Sharing etc intimate photographs or film without consent—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if A shares a photograph or film of another person (B) with B or a third person (C) if—
(a) the photograph or film only shows activity that would be ordinarily seen on a public street, except for a photograph or film of breastfeeding;
(b) the photograph or film was taken in public, where the person depicted was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(c) A reasonably believed that the photograph or film, taken in public, showed a person depicted who was voluntarily nude, partially nude or engaging in a sexual act or toileting in public;
(d) the photograph or film has been previously shared with consent in public;
(e) A reasonably believed that the photograph or film had been previously shared with consent in public;
(f) the photograph or film shows a young child and is of a kind ordinarily shared by family and friends;
(g) the photograph or film is of a child shared for that child’s medical care or treatment, where there is parental consent.
(4) A person (A) does not commit an offence under this section if A shares information about where to access a photograph or film where this photograph or film has already been made available to A.
(5) It is a defence for a person charged with an offence under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime;
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings;
(c) reasonably believed that the sharing was necessary for the administration of justice;
(d) reasonably believed that the sharing was necessary for a genuine medical, scientific or educational purpose; or
(e) reasonably believed that the sharing was in the public interest.
(6) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(7) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(8) “Photograph” includes the negative as well as the positive version.
(9) “Film” means a moving image.
(10) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(11) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(12) A person who commits an offence under this section is liable, on summary conviction, to imprisonment for a term not exceeding 6 months or a fine (or both).”
This new clause creates the offence of sharing an intimate image without consent, providing the necessary exclusions such as for children’s medical care or images taken in public places, and establishing the penalty as triable by magistrates only with maximum imprisonment of 6 months.
New clause 46—Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A intends that the subject of the photograph or film will be caused alarm, distress or humiliation by the sharing of the photograph or film.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where there is the intent to cause alarm etc. by sharing an image, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 47—Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification—
“(1) A person (A) commits an offence if—
(a) A intentionally shares an intimate photograph or film of another person (B) with B or with a third person (C); and
(b) A does so—
(i) without B’s consent, and
(ii) without reasonably believing that B consents; and
(c) A shared the photograph or film for the purpose of obtaining sexual gratification (whether for the sender or recipient).
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(4) References to sharing such a photograph or film with another person include—
(a) sending it to another person by any means, electronically or otherwise;
(b) showing it to another person;
(c) placing it for another person to find; or
(d) sharing it on or uploading it to a user-to-user service, including websites or online public forums.
(5) “Photograph” includes the negative as well as the positive version.
(6) “Film” means a moving image.
(7) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(8) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(9) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates a more serious offence where an intimate image is shared for the purpose of obtaining sexual gratification, with the appropriately more serious penalty of 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 48—Threatening to share etc intimate photographs or film—
“(1) A person (A) commits an offence if—
(a) A threatens to share an intimate photograph or film of another person (B) with B or a third person (C); and
(b) either— (i) A intends B to fear that the threat will be carried out; or (ii) A is reckless as to whether B will fear that the threat will be carried out.
(2) “Threatening to share” should be read to include threatening to share an intimate photograph or film that does not exist and other circumstances where it is impossible for A to carry out the threat.
(3) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(4) An “intimate photograph or film” is a photograph or film that is sexual, shows a person nude or partially nude, or shows a person toileting, of a kind which is not ordinarily seen on a public street, which includes—
(a) any photograph or film that shows something a reasonable person would consider to be sexual because of its nature;
(b) any photograph or film that shows something which, taken as a whole, is such that a reasonable person would consider it to be sexual;
(c) any photograph or film that shows a person’s genitals, buttocks or breasts, whether exposed, covered with underwear or anything being worn as underwear, or where a person is similarly or more exposed than if they were wearing only underwear;
(d) any photograph or film that shows toileting, meaning a photograph or film of someone in the act of defecation and urination, or images of personal care associated with genital or anal discharge, defecation and urination.
(5) References to sharing, or threatening to share, such a photograph or film with another person include—
(a) sending, or threatening to send, it to another person by any means, electronically or otherwise;
(b) showing, or threatening to show, it to another person;
(c) placing, or threatening to place, it for another person to find; or
(d) sharing, or threatening to share, it on or uploading it to a user-to-user service, including websites or online public forums.
(6) “Photograph” includes the negative as well as the positive version.
(7) “Film” means a moving image.
(8) References to a photograph or film include—
(a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
(b) an image which has been altered through computer graphics,
(c) a copy of a photograph, film or image, and
(d) data stored by any means which is capable of conversion into a photograph, film or image.
(9) Sections 74 to 76 of the Sexual Offences Act 2003 apply when determining consent in relation to offences in this section.
(10) A person who commits an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months or a fine (or both);
(b) on conviction on indictment, to imprisonment for a term not exceeding three years.”
This new clause creates another more serious offence of threatening to share an intimate image, regardless of whether such an image actually exists, where the sender intends to cause fear or is reckless as to whether they would cause fear, punishable by 12 months through a magistrates’ court or up to three years in a Crown Court.
New clause 49—Special measures in criminal proceedings for offences involving the sharing of intimate images—
“(1) Chapter 1 of Part 2 of the Youth Justice and Criminal Evidence Act 1999 (giving of evidence or information for purposes of criminal proceedings: special measures directions in case of vulnerable and intimidated witnesses) is amended as follows.
(2) In section 17 (witnesses eligible for assistance on grounds of fear or distress about testifying), in subsection (4A) after paragraph (b) insert “(c) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
This new clause inserts intimate image abuse into the legislation that qualifies victims for special measures when testifying in court (such as screens to hide them from view, testifying by video link, and so on), as already prescribed by law.
New clause 50—Anonymity for victims of offences involving the sharing of intimate images—
“(1) Section 2 of the Sexual Offences (Amendment) Act 1992 (Offences to which this Act applies) is amended as follows.
(2) In subsection (1), after paragraph (db) insert—
(dc) ‘an offence under sections [Sharing etc intimate photographs or film without consent; Sharing etc intimate photographs or film with intent to cause alarm, distress or humiliation; Sharing etc intimate photographs or film without consent for the purpose of obtaining sexual gratification; Threatening to share etc intimate photographs or film] of the Online Safety Act 2023’”.”
Similar to NC49, this new clause gives victims of intimate image abuse the same access to anonymity as victims of other sexual offences, to protect their identities and give them the confidence to testify against their abusers without fear of repercussions.
New clause 54—Report on the effect of Virtual Private Networks on OFCOM’s ability to enforce requirements—
“(1) The Secretary of State must publish a report on the effect of the use of Virtual Private Networks on OFCOM’s ability to enforce requirements under section 112.
(2) The report must be laid before Parliament within six months of the passing of this Act.”
New clause 55—Offence of sending communication facilitating modern slavery and illegal immigration—
‘(1) A person (A) commits an offence if—
(a) (A) intentionally shares with a person (B) or with a third person (C) a photograph or film which is reasonably considered to be, or to be intended to be, facilitating or promoting any activities which do, or could reasonably be expected to, give rise to an offence under—
(i) sections 1 (Slavery, servitude and forced labour), 2 (Human trafficking) or 4 (Committing offence with intent to commit an offence under section 2) of the Modern Slavery Act 2015; or
(ii) sections 24 (Illegal Entry and Similar Offences) or 25 (Assisting unlawful immigration etc) of the Immigration Act 1971; and
(b) (A) does so knowing, or when they reasonably ought to have known, that the activities being depicted are unlawful.
(2) References to a third person (C) in this section are to be read as referring to—
(a) an individual;
(b) a group of individuals;
(c) a section of the public; or
(d) the public at large.
(3) A person (A) does not commit an offence under this section if—
(a) the sharing is undertaken by or on behalf of a journalist or for journalistic purposes;
(b) the sharing is by a refugee organisation registered in the UK which falls within the scope of subsection (3) of section 25A of the Immigration Act 1971;
(c) the sharing is by or on behalf of a duly elected Member of Parliament or other elected representative in the UK.
(4) It is a defence for a person charged under this section to prove that they—
(a) reasonably believed that the sharing was necessary for the purposes of preventing, detecting, investigating or prosecuting crime; or
(b) reasonably believed that the sharing was necessary for the purposes of legal or regulatory proceedings.
(5) A person who commits an offence under this section is liable, on summary conviction, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both).”
This new clause would create a new criminal offence of intentionally sharing a photograph or film that facilitates or promotes modern slavery or illegal immigration.
Government amendments 234 and 102 to 117.
Amendment 195, in clause 104, page 87, line 10, leave out subsection 1 and insert—
“(1) If OFCOM consider that it is necessary and proportionate to do so, they may—
(a) give a notice described in subsection (2), (3) or (4) relating to a regulated user-to-user service or a regulated search service to the provider of the service;
(b) give a notice described in subsection (2), (3) or (4) to a provider or providers of Part 3 services taking into account risk profiles produced by OFCOM under section 84.”
Amendment 152, page 87, line 18, leave out ‘whether’.
This amendment is consequential on Amendment 153.
Amendment 153, page 87, line 19, leave out ‘or privately’.
This amendment removes the ability to monitor encrypted communications.
Government amendment 118.
Amendment 204, in clause 105, page 89, line 17, at end insert—
“(ia) the level of risk of the use of the specified technology accessing, retaining or disclosing the identity or provenance of any confidential journalistic source or confidential journalistic material.”
This amendment would require Ofcom to consider the risk of the use of accredited technology by a Part 3 service accessing, retaining or disclosing the identity or provenance of journalistic sources or confidential journalistic material, when deciding whether to give a notice under Clause 104(1) of the Bill.
Government amendments 119 to 130, 132 to 134, 212, 213, 135 and 214.
Amendment 23, in clause 130, page 114, line 3, leave out paragraph (a).
Government amendment 175.
Amendment 160, in clause 141, page 121, line 9, leave out subsection (2).
This amendment removes the conditionality bar that must be met for super-complaints that relate to a single regulated service.
Amendment 24, page 121, line 16, leave out “The Secretary of State” and insert “OFCOM”.
Amendment 25, page 121, line 21, leave out from “(3),” to end of line 24 and insert “OFCOM must consult—
“(a) The Secretary of State, and
“(b) such other persons as OFCOM considers appropriate.”
This amendment would provide that regulations under clause 141 are to be made by OFCOM rather than by the Secretary of State.
Amendment 189, in clause 142, page 121, line 45, leave out from “including” to end of line 46 and insert
“90 day maximum time limits in relation to the determination and notification to the complainant of—”.
This requires the Secretary of State’s guidance to require Ofcom to determine whether a complaint is eligible for the super-complaints procedure within 90 days.
Amendment 26, in clause 146, page 123, line 33, leave out
“give OFCOM a direction requiring”
and insert “may make representations to”.
Amendment 27, page 123, line 36, leave out subsection (2) and insert—
“(2) OFCOM must have due regard to any representations made by the Secretary of State under subsection (1).”
Amendment 28, page 123, line 38, leave out from “committee” to end of line 39 and insert
“established under this section is to consist of the following members—”.
Amendment 29, page 124, line 1, leave out from “committee” to “publish” in line 2 and insert
“established under this section must”.
Amendment 30, page 124, line 4, leave out subsection (5).
Amendment 32, page 124, line 4, leave out clause 148.
Government amendments 176, 239, 138, 240, 215, 241, 242, 217, 218, 243, 219, 244, 245, 220, 221, 140, 246, 222 to 224, 247, 225, 248, 226 and 227.
Amendment 194, in clause 157, page 131, line 16, leave out from beginning to end of line 17 and insert—
“(a) B has not consented for A to send or give the photograph or film to B, and”.
Government amendments 249 to 252, 228, 229 and 235 to 237.
Government new schedule 2—Amendments of Part 4B of the Communications Act.
Government new schedule 3—Video-sharing platform services: transitional provision etc.
Government amendment 238.
Amendment 35, schedule 11, page 198, line 5, leave out “The Secretary of State” and insert “OFCOM”.
This amendment would give the power to make regulations under Schedule 11 to OFCOM.
Amendment 2, page 198, line 9, leave out “functionalities” and insert “characteristics”.
Amendment 1, page 198, line 9, at end insert—
“(1A) In this schedule, “characteristics” of a service include its functionalities, user base, business model, governance and other systems and processes.”
Amendment 159, page 198, line 9, at end insert—
“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”
This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.
Amendment 36, page 198, line 10, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 37, page 198, line 16, leave out “The Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 3, page 198, line 2, leave out “functionalities” and insert “characteristics”.
Amendment 9, page 198, line 28, leave out “and” and insert “or”.
Amendment 4, page 198, line 29, leave out “functionality” and insert “characteristic”.
Amendment 38, page 198, line 32, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 5, page 198, line 34, leave out “functionalities” and insert “characteristics”.
Amendment 39, page 198, line 37, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 40, page 198, line 41, leave out “the Secretary of State” and insert “OFCOM”.
This amendment is consequential on Amendment 35.
Amendment 6, page 198, line 4, leave out “functionalities” and insert “characteristics”.
Amendment 7, page 199, line 11, leave out “functionalities” and insert “characteristics”.
Amendment 8, page 199, line 28, leave out “functionalities” and insert “characteristics”.
Amendment 41, page 199, line 3, leave out subparagraphs (5) to (11).
This amendment is consequential on Amendment 35.
Government amendments 230, 253 to 261 and 233.
I was about to speak to the programme motion, Mr Speaker, but you have outlined exactly what I was going to say, so thank you for that—I am glad to get the process right.
I am delighted to bring the Online Safety Bill back to the House for the continuation of Report stage. I start by expressing my gratitude to colleagues across the House for their contributions to the Bill through pre-legislative scrutiny and before the summer recess, and for their engagement with me since I took office as the Minister for Tech and the Digital Economy.
The concept at the heart of this legislation is simple: tech companies, like those in every other sector, must take responsibility for the consequences of their business decisions. As they continue to offer users the latest innovations, they must consider the safety of their users as well as profit. They must treat their users fairly and ensure that the internet remains a place for free expression and robust debate. As Members will be aware, the majority of the Bill was discussed on Report before the summer recess. Our focus today is on the provisions that relate to the regulator’s powers and the criminal law reforms. I will also take this opportunity briefly to set out the further changes that the Government have recently committed to making later in the Bill’s passage.
Let me take the Government amendments in turn. The Government’s top priority for this legislation has always been the protection of children. We recognise that the particularly abhorrent and pernicious nature of online child sexual exploitation and abuse—CSEA—demands the most robust response possible. Throughout the passage of the Bill, we have heard evidence of the appalling harm that CSEA causes. Repeatedly, we heard calls for strong incentives for companies to do everything they can to innovate and make safety technologies their priority, to ensure that there is no place for offenders to hide online. The Bill already includes a specific power to tackle CSEA, which allows Ofcom, subject to safeguards, to require tech companies to use accredited technology to identify and remove illegal CSEA content in public and private communications. However, we have seen in recent years how the online world has evolved to allow offenders to reach their victims and one another in new ways.
I am listening to my hon. Friend with great interest on this aspect of child sexual abuse and exploitation, which is a heinous crime. Will he go on to speak about how the Ofcom role will interact with law enforcement, in particular the National Crime Agency, when dealing with these awful crimes?
It is important that we tackle this in a number of ways. My right hon. Friend the Member for Haltemprice and Howden (Mr Davis) and I spoke earlier, and I will come to some of what he will outline. It is important that Ofcom recognises the technologies that are available and—with the Children’s Commissioner as one of the statutory consultees—liaises with the social media platforms, and the agencies, to ensure that there are codes of practice that work, and that we get this absolutely right. It is about enforcing the terms and conditions of the companies and being able to produce the evidence and track the exchanges, as I will outline later, for the agency to use for enforcement.
With the rapid developments in technology, on occasions there will be no existing accredited technology available that will satisfactorily mitigate the risks. Similarly, tech companies might be able to better design solutions that integrate more easily with their services than those that are already accredited. The new regulatory framework must incentivise tech companies to ensure that their safety measures keep pace with the evolving threat, and that they design their services to be safe from the outset. It is for these reasons that the Government have tabled the amendments that we are discussing.
New clauses 11 and 12 establish options for Ofcom when deploying its powers under notices to deal with terrorism content and CSEA content. These notices will empower Ofcom to require companies to use accredited technology to identify and remove illegal terrorism and CSEA content or to prevent users from encountering that content or, crucially, to use their best endeavours to develop or to source technology to tackle CSEA. That strikes the right balance of supporting the adoption of new technology, while ensuring that it does not come at the expense of children’s physical safety.
Terrorism is often linked to non-violent extremism, which feeds into violent extremism and terrorism. How does the Bill define extremism? Previous Governments failed to define it, although it is often linked to terrorism.
This Bill links with other legislation, and obviously the agencies. We do not seek to redefine extremism where those definitions already exist. As we expand on the changes that we are making, we will first ensure that anything that is already illegal goes off the table. Anything that is against the terms and conditions of those platforms that are hosting that content must not be seen. I will come to the safety net and user protection later.
Since Elon Musk’s takeover of Twitter, hate speech has ballooned on the platform and the number of staff members at Twitter identifying images of child sexual abuse and exploitation has halved. How can the Minister be sure that the social media companies are able to mark their own homework in the way that he suggests?
Because if those companies do not, they will get a fine of up to £18 million or 10% of their global turnover, whichever is higher. As we are finding with Twitter, there is also a commercial impetus, because advertisers are fleeing that platform as they see the uncertainty being caused by those changes. A lot of things are moving here to ensure that safety is paramount; it is not just for the Government to act in this area. All we are doing is making sure that those companies enforce their own terms and conditions.
This point is important: we are speaking about terrorism and counter-terrorism and the state’s role in preventing terrorist activity. For clarity, will the Minister update the House later on the work that takes place between his Department and the platforms and, importantly, between the Home Office and the security services? In particular, some specialist work takes place with the Global Internet Forum to Counter Terrorism, which looks at online terrorist and extremist content. That work can ensure that crimes are prevented and that the right kinds of interventions take place.
My right hon. Friend talks with experience from her time at the Home Office. She is absolutely right that the Bill sets a framework to adhere to the terms and conditions of the platforms. It also sets out the ability for the services to look at things such as terrorism and CSEA, which I have been talking about—for example, through the evidence of photos being exchanged. The Bill is not re-examining and re-prosecuting the interaction between all the agencies, however, because that is apparent for all to see.
New clauses 11 and 12 bring those powers in line with the wider safety duties by making it clear that the tools may seek to proactively prevent CSEA content from appearing on a service, rather than focusing only on identification and removal after the fact. That will ensure the best possible protection for children, including on services that offer livestreaming.
The safeguards around those powers remain as strong as before to protect user privacy. Any tools that are developed will be accredited using a rigorous assessment process to ensure that they are highly accurate before the company is asked to use them. That will avoid any unnecessary intrusions into user privacy by minimising the risk that the tools identify false positives.
Crucially, the powers do not represent a ban on or seek to undermine any specific type of technology or design, such as end-to-end encryption. They align with the UK Government’s view that online privacy and cyber-security must be protected, but that technological changes should not be implemented in a way that diminishes public safety.
Can the Minister expand on the notion of “accredited technology”? The definition in the Bill is pretty scant as to where it will emerge from. Is he essentially saying that he is relying on the same industry that has thus far presided over the problem to produce the technology that will police it for us? Within that equation, which seems a little self-defeating, is it the case that if the technology does not emerge for one reason or another—commercial or otherwise—the Government will step in and devise, fund or otherwise create the technology required to be implemented?
I thank my right hon. Friend. It is the technology sector that develops technology—it is a simple, circular definition—not the Government. We are looking to make sure that it has that technology in place, but if we prescribed it in the Bill, it would undoubtedly be out of date within months, never mind years. That is why it is better for us to have a rounded approach, working with the technology sector, to ensure that it is robust enough.
I may not have been clear in my original intervention: my concern is that the legislation relies on the same sector that has thus far failed to regulate itself and failed to invent the technology that is required, even though it is probably perfectly capable of doing so, to produce the technology that we will then accredit to be used. My worry is that the sector, for one reason or another—the same reason that it has not moved with alacrity already to deal with these problems in the 15 years or so that it has existed—may not move at the speed that the Minister or the rest of us require to produce the technology for accreditation. What happens if it does not?
Clearly, the Government can choose to step in. We are setting up a framework to ensure that we get the right balance and are not being prescriptive. I take issue with the idea that a lot of this stuff has not been invented, because there is some pretty robust work on age assurance and verification, and other measures to identify harmful and illegal material, although my right hon. Friend is right that it is not being used as robustly as it could be. That is exactly what we are addressing in the Bill.
My intervention is on the same point as that raised by my right hon. Friend the Member for North West Hampshire (Kit Malthouse), but from the opposite direction, in effect. What if it turns out that, as many security specialists and British leaders in security believe—not just the companies, but professors of security at Cambridge and that sort of thing—it is not possible to implement such measures without weakening encryption? What will the Minister’s Bill do then?
The Bill is very specific with regard to encryption; this provision will cover solely CSEA and terrorism. It is important that we do not encroach on privacy.
I welcome my hon. Friend to his position. Under the Bill, is it not the case that if a company refuses to use existing technologies, that will be a failure of the regulatory duties placed on that company? Companies will be required to demonstrate which technology they will use and will have to use one that is available. On encrypted messaging, is it not the case that companies already gather large amounts of information about websites that people visit before and after they send a message that could be hugely valuable to law enforcement?
My hon. Friend is absolutely right. Not only is it incumbent on companies to use that technology should it exist; if they hamper Ofcom’s inquiries by not sharing information about what they are doing, what they find and which technologies they are not using, they will incur criminal liability under the Bill.
To take that one step further, is it correct that Ofcom would set minimum standards for operators? For example, the Content Authenticity Initiative does not need primary legislation, but is an industry open-standard, open-source format. That is an example of modern technology that all companies could sign up to use, and Ofcom would therefore determine what needs to be done in primary legislation.
Can I be helpful? We did say that our discussions should be within scope, but the Minister is tempting everybody to intervene out of scope. From his own point of view, I would have thought that it would be easier to keep within scope.
Thank you, Mr Speaker; I will just respond to my hon. Friend the Member for Bosworth (Dr Evans). There is a minimum standard in so far as the operators have to adhere to the terms of the Bill. Our aim is to exclude illegal content and ensure that children are as safe as possible within the remit of the Bill.
The changes will ensure a flexible approach so that companies can use their expertise to develop or source the most effective solution for their service, rather than us being prescriptive. That, in turn, supports the continued growth of our digital economy while keeping our citizens safe online.
My hon. Friend may know that there are third-party technology companies—developers of this accredited technology, as he calls it—that do not have access to all the data that might be necessary to develop technology to block the kind of content we are discussing. They need to be given the right to access that data from the larger platforms. Will Ofcom be able to instruct large platforms that have users’ data to make it available to third-party developers of technology that can help to block such content?
Ofcom will be working with the platforms over the next few months—in the lead-up to the commencement of the Bill and afterwards—to ensure that the provisions are operational, so that we get them up and running as soon as practicably possible. My right hon. Friend is right to raise the point.
In Northern Ireland we face the specific issue of the glorification of terrorism. Glorifying terrorism encourages terrorism. Is it possible that the Bill will stop that type of glorification, and therefore stop the terrorism that comes off the back of it?
I will try to cover the hon. Member’s comments a little bit later, if I may, when I talk about some of the changes coming up later in the process.
Moving away from CSEA, I am pleased to say that new clause 53 fulfils a commitment given by my predecessor in Committee to bring forward reforms to address epilepsy trolling. It creates the two specific offences of sending and showing flashing images to an individual with epilepsy with the intention of causing them harm. Those offences will apply in England, Wales and Northern Ireland, providing people with epilepsy with specific protection from this appalling abuse. I would like to place on record our thanks to the Epilepsy Society for working with the Ministry of Justice to develop the new clause.
The offence of sending flashing images captures situations in which an individual sends a communication in a scatter-gun manner—for example, by sharing a flashing image on social media—and the more targeted sending of flashing images to a person who the sender knows or suspects is a person with epilepsy. It can be committed by a person who forwards or shares such an electronic communication as well as by the person sending it. The separate offence of showing flashing images will apply if a person shows flashing images to someone they know or suspect to have epilepsy by means of an electronic communications device—for example, on a mobile phone or a TV screen.
The Government have listened to parliamentarians and stakeholders about the impact and consequences of this reprehensible behaviour, and my thanks go to my hon. Friends the Members for Watford (Dean Russell), for Stourbridge (Suzanne Webb), for Blackpool North and Cleveleys (Paul Maynard) and for Ipswich (Tom Hunt) for their work and campaigning. [Interruption.] Indeed, and the hon. Member for Batley and Spen (Kim Leadbeater), who I am sure will be speaking on this later.
New clause 53 creates offences that are legally robust and enforceable so that those seeking to cause harm to people with epilepsy will face appropriate criminal sanctions. I hope that will reassure the House that the deeply pernicious activity of epilepsy trolling will be punishable by law.
The Minister is thanking lots of hon. Members, but should not the biggest thanks go, first, to the Government for the inclusion of this amendment; and secondly, to Zach Eagling, the inspirational now 11-year-old who was the victim of a series of trolling incidents when flashing images were pushed his way after a charity walk? We have a huge amount to thank Zach Eagling for, and of course the amazing Epilepsy Society too.
A number of Members across the House have been pushing for Zach’s law, and I am really delighted that Zach’s family can see in Hansard that that campaigning has really made a direct change to the law.
I just want to echo the previous points. This has been a hard-fought decision, and I am so proud that the Government have done this, but may I echo the thanks to Zach for being a true hero? We talk about David and Goliath, the giant—the beast—who was taken down, but Zach has beaten the tech giants, and I think this is an incredible success.
I absolutely echo my hon. Friend’s remarks, and I again thank him for his work.
We are also taking steps to strengthen Ofcom’s enforcement powers, which is why we are giving Ofcom a discretionary power to require non-compliant services to publish, or notify their users of, enforcement action that it has taken against the service. Ofcom will be able to use this power to direct a service to publish details of, or notify its UK users about, enforcement notices it receives from Ofcom. I thank the Antisemitism Policy Trust for bringing this proposal to our attention and for its helpful engagement on the issue. This new power will promote transparency by increasing awareness among users of breaches of the duties in the Bill. It will help users to make much more informed decisions about the services they use, and will act as an additional deterrent for service providers.
It is fantastic to have the data released. Does the Minister have any idea how many of these notifications are likely to be put out there when the Bill comes in? Has any work been done on that? Clearly, having thousands of these come out would be very difficult for the public to understand, but half a dozen over a year might be very useful to understand which companies are struggling.
I think that is why Ofcom has discretion, so that it can determine that. The most egregious examples are the ones people can learn from, and it is about doing this in proportion. My hon. Friend is absolutely right that if we are swamped with small notifications, the important ones will be hidden in plain sight. That would not help users, particularly parents, to understand what is going on. It is all about making more informed decisions.
The House will be aware that we recently announced our intention to make a number of other changes to the Bill. We are making those changes because we believe it is vital that people can continue to express themselves freely and engage in pluralistic debate online. That is why the Bill will be amended to strengthen its provisions relating to children and to ensure that the Bill’s protections for adults strike the right balance with its protections for free speech.
The Minister is alluding, I assume, to the legal but harmful provision, but what does he think about this as an example? People are clever; they do not use illegal language. They will not say, “I want to kill all Jews”, but they may well—and do—say, “I want to harm all globalists.” What is the Minister’s view of that?
The right hon. Lady and I have had a detailed chat about some of the abuse that she and many others have been suffering, and there were some particularly egregious examples. This Bill is not, and never will be, a silver bullet. This has to be worked through, with the Government acting with media platforms and social media platforms, and parents also have a role. This will evolve, but we first need to get back to the fundamental point that social media platforms are not geared up to enforce their own terms and conditions. That is ridiculous, a quarter of a century after the world wide web kicked in, and when social media platforms have been around for the best part of 20 years. We are shutting the stable door afterwards, and trying to come up with legislation two decades later.
Order. I am really bothered. I am trying to help the Minister, because although broadening discussion of the Bill is helpful, it is also allowing Members to come in with remarks that are out of scope. If we are going to go out of scope, we could be here a long time. I am trying to support the Minister by keeping him in scope.
Thank you, Mr Speaker; I will try to keep my remarks very much in scope.
The harmful communications offence in clause 151 was one of the reforms to the communication offences proposed in the Bill. Since the Bill was made public, parliamentarians and stakeholders have expressed concern that the threshold that would trigger prosecution for the offence of causing serious distress could bring robust but legitimate conversation into the illegal space. In the light of that concern, we have decided not to take forward the harmful communications offence for now. That will give the Government an opportunity to consider further how the criminal law can best protect individuals from harmful communications, and to ensure that protections for free speech are robust.
This is about the protection of young people, and we are all here for the same reason, including the Minister. We welcome the changes that he is putting forward, but the Royal College of Psychiatrists has expressed a real concern about the mental health of children, and particularly about how screen time affects them. NHS Digital has referred to one in eight 11 to 16-year-olds being bullied. I am not sure whether we see in the Bill an opportunity to protect them, so perhaps the Minister can tell me the right way to do that.
The hon. Gentleman talks about the wider use of screens and screen time, and that is why Ofcom’s media literacy programme, and DCMS’s media literacy strategy—
That is because we have a detailed strategy that tackles many of these issues. Again, none of this is perfect, and as I have said, the Government are working in tandem with the platforms, and with parents and education bodies, to make sure we get that bit right. The hon. Gentleman is right to highlight that as a big issue.
I talked about harmful communications and the recognition that removing that offence could leave a potential gap in the criminal law. The Government have therefore decided not to repeal the existing communications offences in the Malicious Communications Act 1988, or those under section 127(1) of the Communications Act 2003. That will ensure that victims of domestic abuse or other extremely harmful communications will still be robustly protected by the criminal law. Alongside the changes to the harmful communications offence, we are making a number of additional changes to the Bill—those will come later, Mr Speaker, and I will not tread too much into them, as they include the removal of the adult safety duties, often referred to as the legal but harmful provisions. The amended Bill offers adults a triple shield of protection that requires platforms to remove illegal content and material that violates their terms and conditions, and gives adults user controls to help them avoid seeing certain types of content.
The Bill’s key objective, above everything else, is the safety of children online, and we will be making a number of changes to strengthen the Bill’s existing protections for children. We will make clear that we expect platforms to use age assurance technology when identifying the age of their users, and we will also require platforms with minimum age restrictions to explain in their terms of service what measures they have in place to prevent access by those below their minimum age, and to enforce those measures consistently. We are planning to name the Children’s Commissioner as a statutory consultee for Ofcom in its development of the codes of practice, ensuring that children’s views and needs are represented.
That is the Children’s Commissioner for England, specifically because they have particular reserved duties for the whole of the UK. None the less, Ofcom must also have regard to a wider range of voices, which can easily include the other Children’s Commissioners.
On age assurance, does the Minister not see a weakness? Lots of children and young people are far more sophisticated than many of us in the Chamber and will easily find a workaround, as they do now. The onus is being put on the children, so the Bill is not increasing regulation or the safety of those children.
As I said, the social media platforms will have to put in place robust age assurance and age verification for material in an accredited form that is acceptable to Ofcom, which will look at that.
Tackling violence against women and girls is a key priority for the Government. It is unacceptable that women and girls suffer disproportionately from abuse online, and it is right that we go further to address that through the Bill. That is why we will name the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner as statutory consultees for the codes of practice, and list “coercive or controlling behaviour” as a priority offence. That offence disproportionately affects women and girls, and that measure will mean that companies have to take proactive steps to tackle such content.
Finally, we are making a number of criminal law reforms, and I thank the Law Commission for the great deal of important work that it has done to assess the law in these areas.
I strongly welcome some of the ways in which the Bill has been strengthened to protect women and girls, particularly by criminalising cyber-flashing, for example. Does the Minister agree that it is vital that our laws keep pace with the changes in how technology is being used? Will he therefore assure me that the Government will look to introduce measures along the lines set out in new clauses 45 to 50, standing in the name of my right hon. Friend the Member for Basingstoke (Dame Maria Miller), who is leading fantastic work in this area, so that we can build on the Government’s record in outlawing revenge porn and threats to share it?
I thank my hon. Friend, and indeed I thank my right hon. Friend the Member for Basingstoke (Dame Maria Miller) for the amazing work that she has done in this area. We will table an amendment to the Bill to criminalise more behaviour relating to intimate image abuse, so more perpetrators will face prosecution and potentially time in jail. My hon. Friend has worked tirelessly in this area, and we have had a number of conversations. I thank her for that. I look forward to more conversations to ensure that we get the amendment absolutely right and that it does exactly what we all want.
The changes we are making will include criminalising the non-consensual sharing of manufactured intimate images, which, as we have heard, are more commonly known as deepfakes. In the longer term, the Government will also take forward several of the Law Commission’s recommendations to ensure that the legislation is coherent and takes account of advancements in technology.
We will also use the Bill to bring forward a further communication offence to make the encouragement of self-harm illegal. We have listened to parliamentarians and stakeholders concerned about such behaviour and will criminalise that activity, providing users with protection from that harmful content. I commend my right hon. Friend the Member for Haltemprice and Howden on his work in this area and his advocacy for such a change.
Intimate image abuse has been raised with me a number of times by younger constituents, who are particularly vulnerable to such abuse. Within the scope of what we are discussing, I am concerned that we have seen only one successful conviction for revenge porn, so if the Government base their intimate image work on the existing legislative framework for revenge porn, it will do nothing and protect no one, and will instead be a waste of everyone’s time and further let down victims who are already let down by the system.
We will actually base that work on the independent Law Commission’s recommendations, and have been working with it on that basis.
On images that promote self-harm, does the Minister agree that images that promote or glamourise eating disorders should be treated just as seriously as any other content promoting self-harm?
I thank my right hon. Friend, who spoke incredibly powerfully at Digital, Culture, Media and Sport questions, and on a number of other occasions, about her particular experience. That is always incredibly difficult. That area will absolutely be tackled, especially for children, but it is really important—as we will see from further changes to the Bill—that, with the removal of the legal but harmful provisions, there are other protections for adults.
I think last year over 6,000 people died from suicide in the UK. Much of that, sadly, was encouraged by online content, as we saw from the recent coroner’s report into the tragic death of Molly Russell. On new clause 16, tabled by my right hon. Friend the Member for Haltemprice and Howden (Mr Davis), will the Minister confirm that the Government agree with the objectives of new clause 16 and will table an amendment to this Bill—to no other parliamentary vehicle, but specifically to this Bill—to introduce such a criminal offence? Will the Government amendment he referred to be published before year end?
On self-harm, I do not think there is any doubt that we are absolutely aligned. On suicide, I have some concerns about how new clause 16 is drafted—it amends the Suicide Act 1961, which is not the right place to introduce measures on self-harm—but I will work to ensure we get this measure absolutely right as the Bill goes through the other place.
I thank my hon. Friend for giving way. He is almost being given stereo questions from across the House, but I think they might be slightly different. I am very grateful to him for setting out his commitment to tackling suicide and self-harm content, and for his commitment to my right hon. Friend the Member for Chelmsford (Vicky Ford) on eating disorder content. My concern is that there is a really opaque place in the online world between what is legal and illegal, which potentially could have been tackled by the legal but harmful restrictions. Can he set out a little more clearly—not necessarily now, but as we move forward—how we really are going to begin to tackle the opaque world between legal and illegal content?
If my hon. Friend will bear with me—I need to make some progress—I think that will be teased out today and in Committee, should the Bill be recommitted, as we amend the clauses relating directly to what she is talking about, and then as the Bill goes through the other place.
I am grateful to the Minister, who has taken a number of interventions. I fully agree with my hon. Friend the Member for Gosport (Dame Caroline Dinenage). This is a grey area and has consistently been so—many Members have given their views on that in previous stages of the Bill. Will the Minister come back in the later stages on tackling violence against women and girls, and show how the Bill will incorporate key aspects of the Domestic Abuse Act 2021, and tie up with the criminal justice system and the work of the forthcoming victims Bill? We cannot look at these issues in isolation—I see that the Minister of State, Ministry of Justice, my right hon. Friend the Member for Charnwood (Edward Argar) is also on the Front Bench. Rather, they all have to be put together in a golden thread of protecting victims, making sure that people do not become victims, and ensuring that we go after the perpetrators—we must not forget that at all. The Minister will not be able to answer that now, but I would ask him to please do so in the latter stages.
I talked about the fact that the Commissioner for Victims and Witnesses and the Domestic Abuse Commissioner will be statutory consultees, because it is really important that their voice is heard in the implementation of the Bill. We are also listing coercive or controlling behaviour as a priority offence. That is so important when it comes to domestic abuse. Domestic abuse does not start with a slap, a hit or a punch; it starts with emotional abuse—manipulation, coercion and so on. That is why coercive control is an important point not just for domestic abuse, but for bullying, harassment and the wider concerns that the Bill seeks to tackle.
I am one of three Scottish Members present, and the Scottish context concerns me. If time permits me in my contribution later, I will touch on a particularly harrowing case. The school involved has been approached but has done nothing. Education is devolved, so the Minister may want to think about that. It would be too bad if the Bill failed in its good intentions because of a lack of communication in relation to a function delivered by the Scottish Government. Can I take it that there will be the closest possible co-operation with the Scottish Government because of their educational responsibilities?
There simply has to be. These are global companies and we want to make the Bill work for the whole of the UK. This is not an England-only Bill, so the changes must happen for every user, whether they are in Scotland, Northern Ireland, Wales or England.
Will the Minister give way?
I will make a bit of progress, because I am testing Mr Speaker’s patience.
We are making a number of technical amendments to ensure that the new communications offences are targeted and effective. New clause 52 seeks to narrow the exemptions for broadcast and wireless telegraphy licence holders and providers of on-demand programme services, so that a licence holder is exempt only to the extent that a communication is within the course of a licensed activity. A separate group of technical amendments ensures that the definition of sending false and threatening communications will capture all circumstances—far wider than what we have at the moment.
We propose a number of consequential amendments to relevant existing legislation to ensure that new offences operate consistently with the existing criminal law. We are also making a number of wider technical changes to strengthen the enforcement provisions and ensure consistency with other regulatory frameworks. New clause 42 ensures that Ofcom has the power to issue an enforcement notice to a former service provider, guarding against service providers simply shutting down their business and reappearing in a slightly different guise to avoid regulatory sanction. A package of Government amendments will set out how the existing video-sharing platform regime will be repealed and the transitional provisions that will apply to those providers as they transition to the online safety framework.
Finally, new clause 40 will enable the CMA to share information with Ofcom for the purpose of facilitating Ofcom’s online safety functions. That will help to ensure effective co-operation between Ofcom and the CMA.
I thank my hon. Friend for giving way. In the past 40 minutes or so, he has demonstrated the complexity of the changes that are being proposed for the Bill, and he has done a very good job in setting that out. However, will he join me and many other right hon. and hon. Members who feel strongly that a Standing Committee should look at the Bill’s implementation, because of the complexities that he has so clearly demonstrated? I know that is a matter for the House rather than our consideration of the Bill, but I hope that other right hon. and hon. Members will join me in looking for ways to put that right. We need to be able to scrutinise the measures on an ongoing basis.
Indeed, there will be, and are, review points in the Bill. I have no doubt that my right hon. Friend will raise that on other occasions as well.
I want to ensure that there is plenty of time for Members to debate the Bill at this important stage, and I have spoken for long enough. I appreciate the constructive and collaborative approach that colleagues have taken throughout the Bill’s passage.
I am grateful to the Minister. Does he support Baroness Kidron’s amendment asking for swift, humane access to data where there is a suspicion that online information may have contributed to a child’s suicide? That has not happened in previous instances; does he support that important amendment?
I am glad that I gave way so that the hon. Lady could raise that point. Baroness Kidron and her organisation have raised that issue with me directly, and they have gathered media support. We will look at that as the Bill goes through this place and the Lords, because we need to see what the powers are at the moment and why they are not working.
Now is the time to take this legislation forward to ensure that it can deliver the safe and transparent online environment that children and adults so clearly deserve.
Exactly that. My hon. Friend is absolutely right. I come back to the point about drafting this legislation, which is neither straightforward nor easy because of the definitions. It is not just about what is in scope of the Bill but about the implications of the definitions and how they could be applied in law.
The Minister touched on the criminal side of things; interpretation in the criminal courts and how that would be applied in case law are the points that need to be fleshed out. This is where our work on counter-terrorism is so important, because we have been consistent across the world with our Five Eyes partners. Again, there are good models out there that can be built on. We will not fix all of this through one Bill—we know that. This Bill is foundational, which is why we must move forward.
On new clause 11, I seek clarity—in this respect, I need reassurance not from the Minister but from other parts of government—on how victims and survivors, whether of terrorist activity, domestic abuse or violence against women and girls, will be supported and protected by the new safeguards in the Bill, and by the work of the Victims’ Commissioner.
I thank my right hon. Friend for sharing her remarks with the House. She is making an excellent speech based on her considerable experience. On the specific issue of child sexual abuse and exploitation, many organisations, such as the Internet Watch Foundation, are instrumental in removing reports and web pages containing that vile and disgusting material. In the April 2020 White Paper, the Government committed to look at how the Internet Watch Foundation could use its technical expertise in that field. Does she agree that it would be good to hear from the Minister about how the Internet Watch Foundation could work with Ofcom to assist victims?
I was not aware of that, but I am now. I thank my hon. Friend for that information. This is a crucial point. We need a named director associated with the company, the platform and the product in order to introduce the necessary accountability. I do not know whether the Minister will accept this new clause today, but I very much hope that we will look further at how we can make this possible, perhaps in another place.
I very much support the Bill. We need to get it on the statute book, although it will probably need further work, and I support the Government amendments. However, given the link between children viewing pornography and child sexual abuse, I hope that when the Bill goes through the other place, their lordships will consider how regulations around pornographic content can be strengthened, in order to drastically reduce the number of children viewing porn and eventually being drawn into criminal activities themselves. In particular, I would like their lordships to look at tightening and accelerating the age verification and giving equal treatment to all pornography, whether it is on a porn site or a user-to-user service and whether it is online or offline. Porn is harmful to children in whatever form it comes, so the liability on directors and the criminality must be exactly the same. I support the Bill and the amendments in the Government’s name, but it needs to go further when it goes to the other place.
I thank Members for their contributions during today’s debate and for their ongoing engagement with such a crucial piece of legislation. I will try to respond to as many of the issues raised as possible.
My right hon. Friend the Member for Haltemprice and Howden (Mr Davis), who is not in his place, proposed adding the promotion of self-harm as a criminal offence. The Government are sympathetic to the intention behind that proposal; indeed, we asked the Law Commission to consider how the criminal law might address it, and have agreed in principle to create a new offence of encouraging or assisting serious self-harm. The form of the offence recommended by the Law Commission is based on the broadly comparable offence of encouraging or assisting suicide. Like that offence, it covers the encouragement of, or assistance in, self-harm by means of communication and in other ways. When a similar amendment was tabled by the hon. Members for Ochil and South Perthshire (John Nicolson) and for Aberdeen North (Kirsty Blackman) in Committee, limiting the offence to encouragement or assistance by means of sending a message, the then Minister, my right hon. Friend the Member for Croydon South, said it would give only partial effect to the Law Commission’s recommendation. It remains the Government’s intention to give full effect to the Law Commission’s recommendations in due course.
I have raised this on a number of occasions in the past few hours, as have my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) and the right hon. Member for Barking (Dame Margaret Hodge). Will the Minister be good enough to ensure that this matter is thoroughly looked at and, furthermore, that the needed clarification is thought through?
I was going to come to my hon. Friend in two seconds.
In the absence of clearly defined offences, the changes we are making to the Bill mean that it would be almost impossible to take enforcement action against individuals. We are confident that Ofcom will have all the tools it needs to drive the necessary culture change in the sector, from the boardroom down.
This is not the last stage of the Bill. It will be considered in Committee—assuming it is recommitted today—and will come back on Report and Third Reading before going to the House of Lords, so there is plenty of time further to discuss this and to give my hon. Friend the clarification he needs.
Is the Minister saying he is open to changing his view on why he is minded to reject new clause 17 tonight?
I do not think I am changing my view. I am saying that this is not the last stage of the Bill, so there will be plenty of opportunity further to test this, should Members want to do so.
On new clause 28, the Government recognise and agree with the intent behind this amendment to ensure that the interests of child users of regulated services are represented. Protecting children online is the top priority in this Bill, and its key measures will ensure that children are protected from harmful content. The Bill appoints a regulator with comprehensive powers to force tech companies to keep children safe online, and the Bill’s provisions will ensure that Ofcom will listen and respond to the needs of children when identifying priority areas for regulatory action, setting out guidance for companies, taking enforcement action and responding to super-complaints.
Right from the outset, Ofcom must ensure that its risk assessment and priorities reflect the needs of children. For example, Ofcom is required to undertake research that will help understand emerging risks to child safety. We have heard a lot today about the emerging risks with changing technology, and it is important that we keep on top of those and have that children’s voice at the heart of this. The Bill also expands the scope of the Communications Consumer Panel to online safety matters. That independent panel of experts ensures that user needs are at the heart of Ofcom’s regulatory approach. Ofcom will also have the flexibility to choose other mechanisms to better understand user experiences and emerging threats. For example, it may set up user panels or focus groups.
Importantly, Ofcom will have to engage with expert bodies representing children when developing codes of practice and other regulatory guidance. For example, Ofcom will be required to consult persons who represent the interests of children when developing its codes of practice. That means that Ofcom’s codes will be fully informed by how children behave online, how they experience harm and what impact the proposed measures will have on their online experience. The super-complaints process will further enable independent bodies advocating for children to have their voices heard, and will help Ofcom to recognise and eliminate systemic failures.
As we have heard, the Government also plan to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it develops its code of practice. That amendment will be tabled in the House of Lords. Through this consultation, the commissioner will be able to flag systemic issues or issues of particular importance to the regulator, helping Ofcom to target investigations and, if necessary, sanctions at matters that most affect children’s online experience.
As such, there are ample opportunities in the framework for children’s voices to be heard, and the Government are not convinced of the need to legislate for another child user advocacy body. There are plenty of bodies that Ofcom will already be reaching out to, and there is an abundance of experience in committed representative groups that are already engaged, and will continue to be engaged, with the online safety framework. They include the existing statutory body responsible for promoting the interests of children, the Children’s Commissioner. Adding another statutory body would duplicate existing provision and create a confusing landscape, which would not be in the best interests of children.
I hear what the Minister is saying about creating a statutory body, but will he assure this House that there is a specific vehicle for children’s voices to be heard in this? I ask because most of us here are not facing, as our children are, the daily traumas and the constant creation of new apps and new ways for social media to reach out to children. Unless we have their voice heard, this Bill is not going to be robust enough.
As I say, we are putting the Children’s Commissioner as a statutory consultee in the Bill. Ofcom will also have to have regard to all these other organisations, such as the 5Rights Foundation and the NSPCC, that are already there. It is in the legislation that Ofcom will have to have regard to those advocates, but we are not specifically suggesting that there should be a separate body duplicating that work. These organisations are already out there and Ofcom will have to reach out to them when coming up with its codes of practice.
We also heard from my hon. Friend the Member for Dover (Mrs Elphicke) about new clause 55. She spoke powerfully and I commend her for all the work she is doing to tackle the small boats problem, which is affecting so many people up and down this country. I will continue to work closely with her as the Bill continues its passage, ahead of its consideration in the Lords, to ensure that this legislation delivers the desired impact on the important issues of illegal immigration and modern slavery. The legislation will give our law enforcement agencies and the social media companies the powers and guidance they need to stop the promotion of organised criminal activity on social media. Clearly, we have to act.
My right hon. Friend the Member for Witham (Priti Patel), who brings to bear her experience as a former Home Secretary, spoke eloquently about the need for joined-up government, to make sure that the many pieces of legislation and all Departments are working together in this space. This is a really good example of where joined-up government is essential.
Will the Minister confirm that, in line with the discussions that have been had, the Government will look to bring back amendments, should they be needed, in line with new clause 55 and perhaps schedule 7, as the Bill goes to the Lords or returns for further consideration in this House?
All that I can confirm is that we will work with my hon. Friend and with colleagues in the Home Office to make sure that this legislation works in the way that she intends.
We share with my right hon. Friend the Member for Basingstoke (Dame Maria Miller) the concern about the abuse of deepfake images and the need to tackle the sharing of intimate images where the intent is wider than that covered by the current offences. We have committed to bring forward Government amendments in the Lords to do just that, and I look forward to working with her to ensure that, again, we get that part of the legislation exactly right.
We also recognise the intent behind my right hon. Friend’s amendment to provide funding for victim support groups via the penalties paid by entities that fail to comply with the regulatory requirements. Victim and survivor support organisations play a critical role in providing support and tools to help people rebuild their lives. That is why the Government continue to make record investments in this area, increasing the funding for victim and witness support services to £192 million a year by 2024-25. We want victim support services to provide consistency for victims requiring support.
I thank my hon. Friend for giving way and for his commitment to look at this matter before the Bill reaches the House of Lords. Can he just clarify to me that it is his intention to implement the Law Commission’s recommendations that are within the scope of the Bill prior to the Bill reaching the House of Lords? If that is the case, I am happy to withdraw my amendments.
I cannot confirm today at what stage we will legislate. We will continue to work with my right hon. Friend and the Treasury to ensure that we get this exactly right. We will, of course, give due consideration to the Law Commission’s recommendations.
Unless I am mistaken, no other stages of the Bill will come before the House where this can be discussed. Either it will be done or it will not. I had hoped that the Minister would answer in the affirmative.
I understand. We are ahead of the Lords on publication, so yes is the answer.
I have two very quick points for my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright). He was right to speak about acting with humility. We will bring forward amendments for recommittal to amend the approach for category 1 designation—not just the smaller companies that he was talking about, but companies that are pushing that barrier to get to category 1. I very much get his view that the process could be delayed unduly, and we want to make sure that we do not get the unintended consequences that he describes. I look forward to working with him to get the changes to the Bill to work exactly as he describes.
Finally, let me go back to the point that my right hon. Friend the Member for Haltemprice and Howden made about encrypted communications. We are not talking about banning end-to-end encryption or about breaking encryption—for the reasons set out about open banking and other areas. The amendment would leave Ofcom powerless to protect thousands of children and could leave unregulated spaces online for offenders to act, and we cannot therefore accept that.
Just briefly, because I know that the Minister is about to finish, can he respond on amendment 204 with regard to the protection of journalists?
I am happy to continue talking to the right hon. Gentleman, but I believe that we have enough protections in the Bill, with the human touch that we have added after the automatic flagging up of inquiries. The NCA will also have to have due regard to protecting sources. I will continue to work with him on that.
I have not covered everybody’s points, but this has been a very productive debate. I thank everyone for their contributions. We are really keen to get the Bill on the books and to act quickly to ensure that we can make children as safe as possible online.
Question put and agreed to.
New clause 11 accordingly read a Second time, and added to the Bill.
New Clause 12
Warning notices
‘(1) OFCOM may give a notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) to a provider relating to a service or part of a service only after giving a warning notice to the provider that they intend to give such a notice relating to that service or that part of it.
(2) A warning notice under subsection (1) relating to the use of accredited technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a) and (3)(a)) must—
(a) contain details of the technology that OFCOM are considering requiring the provider to use,
(b) specify whether the technology is to be required in relation to terrorism content or CSEA content (or both),
(c) specify any other requirements that OFCOM are considering imposing (see section 106(2) to (4)),
(d) specify the period for which OFCOM are considering imposing the requirements (see section 106(6)),
(e) state that the provider may make representations to OFCOM (with any supporting evidence), and
(f) specify the period within which representations may be made.
(3) A warning notice under subsection (1) relating to the development or sourcing of technology (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) and (3)(b)) must—
(a) describe the proposed purpose for which the technology must be developed or sourced (see section (Notices to deal with terrorism content or CSEA content (or both))(2)(a)(iii) and (iv) and (3)(a)(ii)),
(b) specify steps that OFCOM consider the provider needs to take in order to comply with the requirement described in section (Notices to deal with terrorism content or CSEA content (or both))(2)(b) or (3)(b), or both those requirements (as the case may be),
(c) specify the proposed period within which the provider must take each of those steps,
(d) specify any other requirements that OFCOM are considering imposing,
(e) state that the provider may make representations to OFCOM (with any supporting evidence), and
(f) specify the period within which representations may be made.
(4) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) that relates to both the user-to-user part of a combined service and the search engine of the service (as described in section (Notices to deal with terrorism content or CSEA content (or both))(4)(c) or (d)) may be given to the provider of the service only if—
(a) two separate warning notices have been given to the provider (one relating to the user-to-user part of the service and the other relating to the search engine), or
(b) a single warning notice relating to both the user-to-user part of the service and the search engine has been given to the provider.
(5) A notice under section (Notices to deal with terrorism content or CSEA content (or both))(1) may not be given to a provider until the period allowed by the warning notice for the provider to make representations has expired.’—(Paul Scully.)
This clause, which would follow NC11, also replaces part of existing clause 104. There are additions to the warning notice procedure to take account of the new options for notices under NC11.
Brought up, read the First and Second time, and added to the Bill.
New Clause 20
OFCOM’s reports about news publisher content and journalistic content
‘(1) OFCOM must produce and publish a report assessing the impact of the regulatory framework provided for in this Act on the availability and treatment of news publisher content and journalistic content on Category 1 services (and in this section, references to a report are to a report described in this subsection).
(2) Unless the Secretary of State requires the production of a further report (see subsection (6)), the requirement in subsection (1) is met by producing and publishing one report within the period of two years beginning with the day on which sections (Duties to protect news publisher content) and 16 come into force (or if those sections come into force on different days, the period of two years beginning with the later of those days).
(3) A report must, in particular, consider how effective the duties to protect such content set out in sections (Duties to protect news publisher content) and 16 are at protecting it.
(4) In preparing a report, OFCOM must consult—
(a) persons who represent recognised news publishers,
(b) persons who appear to OFCOM to represent creators of journalistic content,
(c) persons who appear to OFCOM to represent providers of Category 1 services, and
(d) such other persons as OFCOM consider appropriate.
(5) OFCOM must send a copy of a report to the Secretary of State, and the Secretary of State must lay it before Parliament.
(6) The Secretary of State may require OFCOM to produce and publish a further report if the Secretary of State considers that the regulatory framework provided for in this Act is, or may be, having a detrimental effect on the availability and treatment of news publisher content or journalistic content on Category 1 services.
(7) But such a requirement may not be imposed—
(a) within the period of three years beginning with the date on which the first report is published, or
(b) more frequently than once every three years.
(8) For further provision about reports under this section, see section 138.
(9) In this section—
“journalistic content” has the meaning given by section 16;
“news publisher content” has the meaning given by section 49;
“recognised news publisher” has the meaning given by section 50.
(10) For the meaning of “Category 1 service”, see section 82 (register of categories of services).’—(Paul Scully.)
This inserts a new clause (after clause 135) which requires Ofcom to publish a report on the impact of the regulatory framework provided for in the Bill within two years of the relevant provisions coming into force. It also allows the Secretary of State to require Ofcom to produce further reports.
Brought up, read the First and Second time, and added to the Bill.
New Clause 40
Amendment of Enterprise Act 2002
‘In Schedule 15 to the Enterprise Act 2002 (enactments relevant to provisions about disclosure of information), at the appropriate place insert—
“Online Safety Act 2022.”’—(Paul Scully.)
This amendment has the effect that the information gateway in section 241 of the Enterprise Act 2002 allows disclosure of certain kinds of information by a public authority (such as the Competition and Markets Authority) to OFCOM for the purposes of OFCOM’s functions under this Bill.
Brought up, read the First and Second time, and added to the Bill.
New Clause 42
Former providers of regulated services
‘(1) A power conferred by Chapter 6 of Part 7 (enforcement powers) to give a notice to a provider of a regulated service is to be read as including power to give a notice to a person who was, at the relevant time, a provider of such a service but who has ceased to be a provider of such a service (and that Chapter and Schedules 13 and 15 are to be read accordingly).
(2) “The relevant time” means—
(a) the time of the failure to which the notice relates, or
(b) in the case of a notice which relates to the requirement in section 90(1) to co-operate with an investigation, the time of the failure or possible failure to which the investigation relates.’—(Paul Scully.)
This new clause, which is intended to be inserted after clause 162, provides that a notice that may be given under Chapter 6 of Part 7 to a provider of a regulated service may also be given to a former provider of a regulated service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 43
Amendments of Part 4B of the Communications Act
‘Schedule (Amendments of Part 4B of the Communications Act) contains amendments of Part 4B of the Communications Act.’—(Paul Scully.)
This new clause introduces a new Schedule amending Part 4B of the Communications Act 2003 (see NS2).
Brought up, read the First and Second time, and added to the Bill.
New Clause 44
Repeal of Part 4B of the Communications Act: transitional provision etc
‘(1) Schedule (Video-sharing platform services: transitional provision etc) contains transitional, transitory and saving provision—
(a) about the application of this Act and Part 4B of the Communications Act during a period before the repeal of Part 4B of the Communications Act (or, in the case of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), in respect of charging years as mentioned in that Part);
(b) in connection with the repeal of Part 4B of the Communications Act.
(2) The Secretary of State may by regulations make transitional, transitory or saving provision of the kind mentioned in subsection (1)(a) and (b).
(3) Regulations under subsection (2) may amend or repeal—
(a) Part 2A of Schedule 3;
(b) Schedule (Video-sharing platform services: transitional provision etc).
(4) Regulations under subsection (2) may, in particular, make provision about—
(a) the application of Schedule (Video-sharing platform services: transitional provision etc) in relation to a service if the transitional period in relation to that service ends on a date before the date when section 172 comes into force;
(b) the application of Part 3 of Schedule (Video-sharing platform services: transitional provision etc), including further provision about the calculation of a provider’s non-Part 4B qualifying worldwide revenue for the purposes of paragraph 19 of that Schedule;
(c) the application of Schedule 10 (recovery of OFCOM’s initial costs), and in particular how fees chargeable under that Schedule may be calculated, in respect of charging years to which Part 3 of Schedule (Video-sharing platform services: transitional provision etc) relates.’—(Paul Scully.)
This new clause introduces a new Schedule containing transitional provisions (see NS3), and provides a power for the Secretary of State to make regulations containing further transitional provisions etc.
Brought up, read the First and Second time, and added to the Bill.
New Clause 51
Publication by providers of details of enforcement action
‘(1) This section applies where—
(a) OFCOM have given a person (and not withdrawn) any of the following—
(i) a confirmation decision;
(ii) a penalty notice under section 119;
(iii) a penalty notice under section 120(5);
(iv) a penalty notice under section 121(6), and
(b) the appeal period in relation to the decision or notice has ended.
(2) OFCOM may give to the person a notice (a “publication notice”) requiring the person to—
(a) publish details describing—
(i) the failure (or failures) to which the decision or notice mentioned in subsection (1)(a) relates, and
(ii) OFCOM’s response, or
(b) otherwise notify users of the service to which the decision or notice mentioned in subsection (1)(a) relates of those details.
(3) A publication notice may require a person to publish details under subsection (2)(a) or give notification of details under subsection (2)(b) or both.
(4) A publication notice must—
(a) specify the decision or notice mentioned in subsection (1)(a) to which it relates,
(b) specify or describe the details that must be published or notified,
(c) specify the form and manner in which the details must be published or notified,
(d) specify a date by which the details must be published or notified, and
(e) contain information about the consequences of not complying with the notice.
(5) Where a publication notice requires a person to publish details under subsection (2)(a) the notice may also specify a period during which publication in the specified form and manner must continue.
(6) Where a publication notice requires a person to give notification of details under subsection (2)(b) the notice may only require that notification to be given to United Kingdom users of the service (see section 184).
(7) A publication notice may not require a person to publish or give notification of anything that, in OFCOM’s opinion—
(a) is confidential in accordance with subsections (8) and (9), or
(b) is otherwise not appropriate for publication or notification.
(8) A matter is confidential under this subsection if—
(a) it relates specifically to the affairs of a particular body, and
(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that body.
(9) A matter is confidential under this subsection if—
(a) it relates to the private affairs of an individual, and
(b) publication or notification of that matter would or might, in OFCOM’s opinion, seriously and prejudicially affect the interests of that individual.
(10) A person to whom a publication notice is given has a duty to comply with it.
(11) The duty under subsection (10) is enforceable in civil proceedings by OFCOM—
(a) for an injunction,
(b) for specific performance of a statutory duty under section 45 of the Court of Session Act 1988, or
(c) for any other appropriate remedy or relief.
(12) For the purposes of subsection (1)(b) “the appeal period”, in relation to a decision or notice mentioned in subsection (1)(a), means—
(a) the period during which any appeal relating to the decision or notice may be made, or
(b) where such an appeal has been made, the period ending with the determination or withdrawal of that appeal.’—(Paul Scully.)
This new clause, which is intended to be inserted after clause 129, gives OFCOM the power to require a person to whom a confirmation decision or penalty notice has been given to publish details relating to the decision or notice or to otherwise notify service users of those details.
Brought up, read the First and Second time, and added to the Bill.
New Clause 52
Exemptions from offence under section 152
‘(1) A recognised news publisher cannot commit an offence under section 152.
(2) An offence under section 152 cannot be committed by the holder of a licence under the Broadcasting Act 1990 or 1996 in connection with anything done under the authority of the licence.
(3) An offence under section 152 cannot be committed by the holder of a multiplex licence in connection with anything done under the authority of the licence.
(4) An offence under section 152 cannot be committed by the provider of an on-demand programme service in connection with anything done in the course of providing such a service.
(5) An offence under section 152 cannot be committed in connection with the showing of a film made for cinema to members of the public.’—(Paul Scully.)
This new clause contains exemptions from the offence in clause 152 (false communications). The clause ensures that holders of certain licences are only exempt if they are acting as authorised by the licence and, in the case of Wireless Telegraphy Act licences, if they are providing a multiplex service.
Brought up, read the First and Second time, and added to the Bill.
New Clause 53
Offences of sending or showing flashing images electronically: England and Wales and Northern Ireland (No.2)
‘(1) A person (A) commits an offence if—
(a) A sends a communication by electronic means which consists of or includes flashing images (see subsection (13)),
(b) either condition 1 or condition 2 is met, and
(c) A has no reasonable excuse for sending the communication.
(2) Condition 1 is that—
(a) at the time the communication is sent, it is reasonably foreseeable that an individual with epilepsy would be among the individuals who would view it, and
(b) A sends the communication with the intention that such an individual will suffer harm as a result of viewing the flashing images.
(3) Condition 2 is that, when sending the communication—
(a) A believes that an individual (B)—
(i) whom A knows to be an individual with epilepsy, or
(ii) whom A suspects to be an individual with epilepsy,
will, or might, view it, and
(b) A intends that B will suffer harm as a result of viewing the flashing images.
(4) In subsections (2)(a) and (3)(a), references to viewing the communication are to be read as including references to viewing a subsequent communication forwarding or sharing the content of the communication.
(5) The exemptions contained in section (Exemptions from offence under section 152) apply to an offence under subsection (1) as they apply to an offence under section 152.
(6) For the purposes of subsection (1), a provider of an internet service by means of which a communication is sent is not to be regarded as a person who sends a communication.
(7) In the application of subsection (1) to a communication consisting of or including a hyperlink to other content, references to the communication are to be read as including references to content accessed directly via the hyperlink.
(8) A person (A) commits an offence if—
(a) A shows an individual (B) flashing images by means of an electronic communications device,
(b) when showing the images—
(i) A knows that B is an individual with epilepsy, or
(ii) A suspects that B is an individual with epilepsy,
(c) when showing the images, A intends that B will suffer harm as a result of viewing them, and
(d) A has no reasonable excuse for showing the images.
(9) An offence under subsection (1) or (8) cannot be committed by a healthcare professional acting in that capacity.
(10) A person who commits an offence under subsection (1) or (8) is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding six months or a fine not exceeding the statutory maximum (or both);
(c) on conviction on indictment, to imprisonment for a term not exceeding five years or a fine (or both).
(11) It does not matter for the purposes of this section whether flashing images may be viewed at once (for example, a GIF that plays automatically) or only after some action is performed (for example, pressing play).
(12) In this section—
(a) references to sending a communication include references to causing a communication to be sent;
(b) references to showing flashing images include references to causing flashing images to be shown.
(13) In this section—
“electronic communications device” means equipment or a device that is capable of transmitting images by electronic means;
“flashing images” means images which carry a risk that an individual with photosensitive epilepsy who viewed them would suffer a seizure as a result;
“harm” means—
(a) a seizure, or
(b) alarm or distress;
“individual with epilepsy” includes, but is not limited to, an individual with photosensitive epilepsy;
“send” includes transmit and publish (and related expressions are to be read accordingly).
(14) This section extends to England and Wales and Northern Ireland.’—(Paul Scully.)
This new clause creates (for England and Wales and Northern Ireland) a new offence of what is sometimes known as “epilepsy trolling”: sending or showing flashing images electronically to people with epilepsy, intending to cause them harm.
Brought up, read the First and Second time, and added to the Bill.
New Clause 16
Communication offence for encouraging or assisting self-harm
‘(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“D”) commits an offence if—
(a) D sends a message,
(b) the message encourages or could be used to assist another person (“P”) to inflict serious physical harm upon themselves, and
(c) D’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, D.
(3) D may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If D arranges for a person (“D2”) to do an Act and D2 does that Act, D is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for D to prove that—
(a) P had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from D; and
(b) P’s intention to inflict serious physical harm upon themselves was not initiated by D; and
(c) the message was wholly motivated by compassion towards D or to promote the interests of P’s health or wellbeing.”’—(Mr Davis.)
This new clause would create a new communication offence for sending a message encouraging or assisting another person to self-harm.
Brought up, and read the First time.
Question put, That the clause be read a Second time.