[Relevant documents: Second Report of the Petitions Committee of Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Letter from the Chair of the Women and Equalities Committee to the Minister for Tech and the Digital Economy regarding Pornography and its impact on VAWG, dated 13 June 2022; Letter from the Minister for Tech and the Digital Economy to the Chair of the Women and Equalities Committee regarding Pornography and its impact on VAWG, dated 30 August 2022; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
Madam Deputy Speaker (Dame Rosie Winterton)

Before we open the debate, I want to make a brief comment about the scope of today’s debate. Today’s debate on consideration follows the re-committal of the Bill to a Public Bill Committee in December last year. We are therefore debating today only the new clauses and amendments listed on the selection paper issued today. These are either: new clauses relating to the re-committed clauses and schedules; amendments to those clauses and schedules; or amendments to other parts of the Bill consequential on changes made to the Bill on re-committal in the Public Bill Committee.

On 5 December, the House finished its consideration on report of other parts of the Bill. The scope of today’s report stage generally does not include those parts of the Bill that were not re-committed. The exception is where amendments on the selection paper are consequential to the changes made to re-committed clauses, and relate to clauses that were not re-committed. Should there be time for debate on Third Reading, it is of course permissible to speak then to any of the content of the Bill.

I should also remind the House that, because of the time taken for the emergency debate, proceedings on consideration are now scheduled to finish at 8.13 pm and proceedings on Third Reading at 9.13 pm.

New Clause 1

Report on redress for individual complaints

‘(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under section 17 of this Act.

(2) The report must—

(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services;

(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and

(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services.

(3) The report must be laid before Parliament within six months of the commencement of section 17.’—(Alex Davies-Jones.)

Brought up, and read the First time.

Alex Davies-Jones (Pontypridd) (Lab)

I beg to move, That the clause be read a Second time.

Madam Deputy Speaker

With this it will be convenient to discuss the following:

New clause 2—Offence of failing to comply with a relevant duty

‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.

(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—

(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or

(b) was a person purporting to act in such a capacity.

(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).

(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’

This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.

New clause 3—Child user empowerment duties

‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.

(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.

(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or

(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.

(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.

(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.

(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.

(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—

(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and

(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.

(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.

(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—

(a) non-verified users, or

(b) adult users, or

(c) any user other than those on a list approved by the child user.

(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—

(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and

(b) the size and capacity of the provider of a service.

(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).

(12) In this section references to features include references to functionalities and settings.’

New clause 4—Safety duties protecting adults and society: minimum standards for terms of service

‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).

(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.

(3) OFCOM must, at least once a year, conduct a review of—

(a) the extent to which providers are meeting the minimum standards, and

(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.

(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.

(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.

(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’

New clause 5—Harm to adults and society risk assessment duties

‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).

(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.

(3) A duty to take appropriate steps to keep a harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.

(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;

(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;

(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.

(7) See also—

(a) section 19(2) (records of risk assessments), and

(b) Schedule 3 (timing of providers’ assessments).’

New clause 6—Safety duties protecting adults and society

‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.

(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).

(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content;

(c) limiting the recommendation or promotion of the content;

(d) recommending or promoting the content;

(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];

“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.

(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’

New clause 7—“Content that is harmful to adults and society” etc

‘(1) This section applies for the purposes of this Part.

(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.

(3) “Content that is harmful to adults and society” means—

(a) priority content that is harmful to adults and society, or

(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.

(4) For the purposes of this section—

(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and

(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—

(i) the content’s potential financial impact,

(ii) the safety or quality of goods featured in the content, or

(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).

(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—

(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and

(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).

(6) Sections 55 and 56 contain further provision about regulations made under this section.’

Government amendments 1 to 4.

Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—

“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’

Amendment 82, page 10, line 25, at end insert—

‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’

This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.

Amendment 83, page 10, line 25, at end insert—

‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’

Amendment 84, page 10, line 25, at end insert—

‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’

Amendment 45, page 10, line 36, leave out paragraph (d) and insert—

‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.

Amendment 47, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to livestreaming features.”’

Amendment 46, page 10, line 43, at end insert ‘, and

“(i) reducing or removing a user’s access to private messaging features.”’

Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’

Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert

‘in operation by default for’.

Amendment 52, page 12, line 30, after ‘non-verified users’ insert

‘and to enable them to see whether another user is verified or non-verified.’

This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.

Amendment 49, page 12, line 30, at end insert—

‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’

Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.

This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.

Amendment 55, page 18, line 15, at end insert—

‘(4A) Content that is harmful to adults and society.’

Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—

‘(6) The following kinds of complaint are relevant for Category 1 services—

(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;

(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—

(i) section [adults and society online safety],

(ii) section 12 (user empowerment),

(iii) section 13 (content of democratic importance),

(iv) section 14 (news publisher content),

(v) section 15 (journalistic content), or

(vi) section 18(4), (6) or (7) (freedom of expression and privacy);

(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;

(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’

Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert

‘, 10 or [harm to adults and society risk assessment duties]’.

Amendment 58, page 22, line 37, at end insert—

‘(ba) section [adults and society online safety] (adults and society online safety),’

Government amendment 5.

Amendment 59, clause 44, page 44, line 11, at end insert

‘or

(ba) section [adults and society online safety] (adults and society online safety);’

Government amendment 6.

Amendment 60, clause 55, page 53, line 43, at end insert—

‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’

Amendment 61, page 53, line 45, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 62, page 54, line 8, after ‘54’ insert

‘or [“Content that is harmful to adults and society” etc]’.

Amendment 63, page 54, line 9, leave out ‘are to children’ and insert

‘or adults are to children or adults and society’.

Government amendments 7 to 16.

Amendment 77, clause 94, page 85, line 42, after ‘10’ insert

‘, [Adults and society risk assessment duties]’.

Amendment 78, page 85, line 44, at end insert—

‘(iiia) section [Adults and society online safety] (adults and society online safety);’

Amendment 54, clause 119, page 102, line 22, at end insert—

‘Section [Safety duties protecting adults and society: minimum standards for terms of service]

Minimum standards for terms of service’

Amendment 79, page 102, line 22, at end insert—

‘Section [Harm to adults and society assessments]

Harm to adults and society risk assessments

Section [Adults and society online safety]

Adults and society online safety’

Government amendments 17 to 19.

Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.

Government amendments 20 to 23.

Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert

‘, [“Content that is harmful to adults and society” etc] and 55’.

Government amendments 24 to 42.

Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert

‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’

Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert

‘priority content that is harmful to adults and society’.

Amendment 70, page 208, line 2, leave out

‘or content that is harmful to children’

and insert

‘content that is harmful to children or priority content that is harmful to adults and society’.

Amendment 71, page 208, line 10, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 72, page 208, line 13, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 73, page 210, line 2, at end insert

‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 50, schedule 11, page 217, line 31, at end insert—

‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’

Amendment 74, page 218, line 24, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 75, page 219, line 6, leave out

‘and content that is harmful to children’

and insert

‘content that is harmful to children and priority content that is harmful to adults and society’.

Amendment 76, page 221, line 24, at end insert—

‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.

Amendment 80, page 240, line 35, in schedule 17, at end insert—

‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

I have noticed that some people are standing who may not have applied earlier. If anybody is aware of that, can they let me know, and I can adjust timings accordingly? At the moment, my estimate is that if everybody takes no longer than seven minutes, and perhaps more like six, we can get everybody in comfortably without having to impose a time limit.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

I remind hon. Members about the six-minute advisory time limit.

Dame Caroline Dinenage (Gosport) (Con)

It is a great relief to see the Online Safety Bill finally reach this stage. It seems like a long time since my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) kicked it off with the ambitious aim of making the UK the safest place in the world to be online. Although other countries around the world had picked at the edges of it, we were truly the first country in the world to set out comprehensive online safety legislation. Since then, other jurisdictions have started and, in some cases, concluded this work. As one of the relay of Ministers who have carried this particular baton of legislation on its very long journey, I know we are tantalisingly close to getting to the finish line. That is why we need to focus on that today, and I am really grateful to the hon. Member for Pontypridd (Alex Davies-Jones) for confirming that the Opposition are going to support the Bill on Third Reading.

--- Later in debate ---
Dame Caroline Dinenage

My hon. Friend is absolutely right to raise this, because we do need the Bill to be future-proofed to deal with some of the recently emerging threats to women and others that the online world has offered.

The potential threat of online harms is everyday life for most children in the modern world. Before Christmas, I received an email from my son’s school highlighting a TikTok challenge encouraging children to strangle each other until they passed out. This challenge probably did not start on TikTok, and it certainly is not exclusive to the platform, but when my children were born I never envisaged a day when I would have to sit them down and warn them about the potential dangers of allowing someone else to throttle them until they passed out. It is terrifying. Our children need this legislation.

I welcome the Government support for amendment 84 to clause 11, in the name of my hon. Friend the Member for Rutland and Melton (Alicia Kearns), to ban content that advertises so-called conversion therapies for LGBTQ+ people. Someone’s sexuality and who they love is not something to be cured, and unscrupulous crooks should not be able to profit from pushing young people towards potentially sinister and harmful treatments.

I really sympathise with the aims behind new clause 2, on senior executive liability. It is vital that this regime has the teeth to protect children and hold companies to account. I know the 10% of annual global turnover maximum fine is higher than some of the global comparisons, and certainly having clear personal consequences for those responsible for enforcing the law is an incentive for them to do it properly, but there is clearly a balance to strike. We must make sure that sanctions are proportionate and targeted, and do not make the UK a less attractive place to build a digital business. I am really pleased to hear Ministers’ commitment to a final amendment that will strike that really important balance.

I am concerned about the removal of measures on legal but harmful content. I understand the complexity of defining them, but other measures, including the so-called triple shield, do not offer the same protections for vulnerable adults or avoid the cliff edge when someone reaches the age of 18. That particularly concerns me for adults with special educational needs or disabilities. The key point here is that, if the tragic cases of Molly Russell and dozens of young people like her teach us anything, it is that dreadful, harmful online content cannot be defined strictly by what is illegal, because algorithms do not differentiate between harmful and harmless content. They see a pattern and they exploit it.

We often talk about the parallels between the online and offline world—we say that what is illegal online should be illegal offline, and vice versa—but in reality the two worlds are fundamentally different. In the real world, for a young person struggling with an eating disorder or at risk of radicalisation, their inner demons are not reinforced by everyone they meet on the street, but algorithms are echo chambers. They take our fears and our paranoia, and they surround us with unhealthy voices that normalise and validate them, however dangerous and however hateful, glamorising eating disorders, accelerating extremist, racist and antisemitic views and encouraging violent misogyny on incel sites.

That is why I worry that the opt-out option suggested in the Bill simply does not offer enough protection: the lines between what is legal and illegal are too opaque. Sadly, it feels as though this part of the Bill has become the lightning rod for those who think it will result in an overly censorious approach. However, we are where we are. As the Molly Rose Foundation said, the swift implementation of the Bill must now be the priority. Time is no longer on our side, and while we perfect this vast, complicated and inherently imperfect legislation, the most unspeakable content is allowed to proliferate in the online world every single day.

Finally, I put on record the exhaustive efforts made by the incredible team at the Department for Digital, Culture, Media and Sport and the Home Office, who brought this Bill to fruition. If there was ever an example of not letting the perfect be the enemy of the good, this is it, and right now we need to get this done. The stakes in human terms simply could not be any higher.

Madam Deputy Speaker (Dame Rosie Winterton)

I call the SNP spokesperson, Kirsty Blackman.

Kirsty Blackman (Aberdeen North) (SNP)

I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on what was one of the best speeches on this Bill—and we have heard quite a lot. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) covered nicely the amendment on liability, and brought up the issue of hate, particularly when pointed towards the Jewish community. I thank her for consistently bringing that up. It is important to hear her voice and others on this issue.

Amendment 43 was tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson) and it regards a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default option. We should start from a point of view that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.

Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.

If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.

Amendment 44 is on habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend time on them. They make money by encouraging people to spend as much time on them as possible—that is the intention behind them. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider that important issue.

New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.

Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.

We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.

I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that there are moves by the UK Government to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that.

Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is about the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key thing, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, set up entirely to dox trans people and put their lives at risk—as category 1 platforms and require them to meet all the rules, risk assessments and things for those platforms.

We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.

I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

Order. Things are not going quite according to plan, so colleagues might perhaps like to gear more towards five minutes as we move forward.

Luke Pollard (Plymouth, Sutton and Devonport) (Lab/Co-op)

I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.

Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, the right hon. Member for Witham (Priti Patel), visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.

In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend the Member for Barking (Dame Margaret Hodge) spoke about incentivises hate to be viewed, shared and indulged in.

This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.

We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.

I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase the friction to reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes anti-Semitic, racist, homophobic and transphobic items layered on top of one another. However, without a “legal but harmful” provision, the Bill does nothing to force search engines to downrate harmful content. If it is to be online, it needs to be harder and harder to find.

I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work because it will be possible to evade it with a simple Google Chrome extension that will auto-toggle and therefore make it almost redundant immediately. It will be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.

However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.

Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Center for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users’ posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—unlike pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.

--- Later in debate ---
Several hon. Members rose—

Madam Deputy Speaker (Dame Rosie Winterton)

In order to ensure that we get everybody in, I am going to introduce a five-minute time limit. I call Richard Burgon.

Richard Burgon (Leeds East) (Lab)

I have listened with interest to all the powerful speeches that have been made today. As legislation moves through Parliament, it is meant to be improved, but the great pity with this Bill is that it has got worse, not better. It is a real tragedy that measures protecting adults from harmful but legal content have been watered down.

I rise to speak against the amendments that have come from the Government, including amendments 11 to 14 and 18 and 19, which relate to the removal of adult safety duties. I am also speaking in favour of new clause 4 from the Labour Front Bench team and amendment 43 from the SNP, which go at least some of the way to protect adults from harmful but legal content.