Online Safety Bill Debate
Alex Davies-Jones (Labour - Pontypridd)
Commons Chamber
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 2—Offence of failing to comply with a relevant duty—
‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.
(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—
(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or
(b) was a person purporting to act in such a capacity.
(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).
(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’
This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.
New clause 3—Child user empowerment duties—
‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.
(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.
(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or
(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.
(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.
(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.
(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.
(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and
(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.
(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—
(a) non-verified users, or
(b) adult users, or
(c) any user other than those on a list approved by the child user.
(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—
(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and
(b) the size and capacity of the provider of a service.
(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).
(12) In this section references to features include references to functionalities and settings.’
New clause 4—Safety duties protecting adults and society: minimum standards for terms of service—
‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).
(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.
(3) OFCOM must, at least once a year, conduct a review of—
(a) the extent to which providers are meeting the minimum standards, and
(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.
(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.
(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.
(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’
New clause 5—Harm to adults and society risk assessment duties—
‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).
(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep a harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.
(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;
(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;
(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;
(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;
(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);
(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.
(7) See also—
(a) section 19(2) (records of risk assessments), and
(b) Schedule 3 (timing of providers’ assessments).’
New clause 6—Safety duties protecting adults and society—
‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.
(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).
(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).
(4) These are the kinds of treatment of content referred to in subsection (3)—
(a) taking down the content;
(b) restricting users’ access to the content;
(c) limiting the recommendation or promotion of the content;
(d) recommending or promoting the content;
(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).
(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently.
(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—
(a) the kinds of such content identified, and
(b) the incidence of those kinds of content on the service.
(8) In this section—
“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];
“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.
(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’
New clause 7—“Content that is harmful to adults and society” etc—
‘(1) This section applies for the purposes of this Part.
(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.
(3) “Content that is harmful to adults and society” means—
(a) priority content that is harmful to adults and society, or
(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.
(4) For the purposes of this section—
(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and
(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—
(i) the content’s potential financial impact,
(ii) the safety or quality of goods featured in the content, or
(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).
(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—
(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and
(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).
(6) Sections 55 and 56 contain further provision about regulations made under this section.’
Government amendments 1 to 4.
Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—
“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’
Amendment 82, page 10, line 25, at end insert—
‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’
This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.
Amendment 83, page 10, line 25, at end insert—
‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’
Amendment 84, page 10, line 25, at end insert—
‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’
Amendment 45, page 10, line 36, leave out paragraph (d) and insert—
‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.
Amendment 47, page 10, line 43, at end insert ‘, and
“(i) reducing or removing a user’s access to livestreaming features.”’
Amendment 46, page 10, line 43, at end insert ‘, and
“(i) reducing or removing a user’s access to private messaging features.”’
Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’
Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert
‘in operation by default for’.
Amendment 52, page 12, line 30, after ‘non-verified users’ insert
‘and to enable them to see whether another user is verified or non-verified.’
This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.
Amendment 49, page 12, line 30, at end insert—
‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’
Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.
This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.
Amendment 55, page 18, line 15, at end insert—
‘(4A) Content that is harmful to adults and society.’
Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—
‘(6) The following kinds of complaint are relevant for Category 1 services—
(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;
(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—
(i) section [adults and society online safety]
(ii) section 12 (user empowerment),
(iii) section 13 (content of democratic importance),
(iv) section 14 (news publisher content),
(v) section 15 (journalistic content), or
(vi) section 18(4), (6) or (7) (freedom of expression and privacy);
(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;
(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’
Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert
‘, 10 or [harms to adults and society risk assessment duties]’.
Amendment 58, page 22, line 37, at end insert—
‘(ba) section [adults and society online safety] (adults and society online safety),’
Government amendment 5.
Amendment 59, clause 44, page 44, line 11, at end insert
‘or
(ba) section [adults and society online safety] (adults and society online safety);’
Government amendment 6.
Amendment 60, clause 55, page 53, line 43, at end insert—
‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’
Amendment 61, page 53, line 45, after ‘54’ insert
‘or [“Content that is harmful to adults and society” etc]’.
Amendment 62, page 54, line 8, after ‘54’ insert
‘or [“Content that is harmful to adults and society” etc]’.
Amendment 63, page 54, line 9, leave out ‘are to children’ and insert
‘or adults are to children or adults and society’.
Government amendments 7 to 16.
Amendment 77, clause 94, page 85, line 42, after ‘10’ insert
‘, [Adults and society risk assessment duties]’.
Amendment 78, page 85, line 44, at end insert—
‘(iiia) section [Adults and society online safety] (adults and society online safety);’
Amendment 54, clause 119, page 102, line 22, at end insert—
‘Section [Safety duties protecting adults and society: minimum standards for terms of service] Minimum standards for terms of service’
Amendment 79, page 102, line 22, at end insert—
‘Section [Harm to adults and society assessments] Harm to adults and society risk assessments
Section [Adults and society online safety] Adults and society online safety’
Government amendments 17 to 19.
Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.
Government amendments 20 to 23.
Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert
‘, [“Content that is harmful to adults and society” etc] and 55’.
Government amendments 24 to 42.
Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert
‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’
Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 70, page 208, line 2, leave out
‘or content that is harmful to children’
and insert
‘content that is harmful to children or priority content that is harmful to adults and society’.
Amendment 71, page 208, line 10, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 72, page 208, line 13, leave out
“and content that is harmful to children”
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 73, page 210, line 2, at end insert
‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.
Amendment 50, schedule 11, page 217, line 31, at end insert—
‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’
Amendment 74, page 218, line 24, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 75, page 219, line 6, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 76, page 221, line 24, at end insert—
‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.
Amendment 80, page 240, line 35, in schedule 17, at end insert—
‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.
Once again, it is a privilege to be back in the Chamber opening this debate—the third Report stage debate in recent months—on this incredibly important and urgently needed piece of legislation. I speak on behalf of colleagues across the House when I say that the Bill is in a much worse position than when it was first introduced. Even so, it is vital that it is now able to progress to the other place. Although we are all pleased to see the Bill return today, the Government’s delays have been incredibly costly, and we still have a long way to go until we see meaningful change for the better.
In December, during the last Report stage debate, we had the immense privilege of being joined in the Public Gallery by a number of families who have lost children in connection with online harms. It is these families whom we must keep in mind as we seek to get the Bill over the line once and for all. As ever, I pay tribute to their incredible efforts in the most difficult of all circumstances.
Today’s debate is also very timely: earlier today, the End Violence Against Women and Girls coalition and Glitch, a charity committed to ending online abuse, handed in their petition calling on the Prime Minister to protect women and girls online. The petition has amassed more than 90,000 signatures, and that number is still rising, so we know there is strong support for improving internet safety across the board. I commend all those involved on their fantastic efforts in raising this important issue.
It would be remiss of me not to make a brief comment on the Government’s last-minute U-turns in their stance on criminal sanctions. The fact that we are seeing amendments withdrawn at the last minute goes to show that this Government have absolutely no idea where they truly stand on these issues and that they are ultimately too weak to stand up against vested interests, whereas Labour is on the side of the public and has consistently put safety at the forefront throughout the Bill’s passage.
More broadly, I made Labour’s feelings about the Government’s highly unusual decision to send part of this Bill back to Committee a second time very clear during the previous debate. I will spare colleagues by not repeating those frustrations here, but let me be clear: it is absolutely wrong that the Government chose to remove safety provisions relating to “legal but harmful” content in Committee. That is a major weakening, not strengthening, of the Bill; everyone online, including users and consumers, will be worse off without those provisions.
The Government’s alternative proposal, to introduce a toggle to filter out harmful content, is unworkable. Replacing the sections of this Bill that could have gone some way towards preventing harm with an emphasis on free speech instead undermines the very purpose of the Bill. It will embolden abusers, covid deniers, hoaxers and others, who will feel encouraged to thrive online.
In Committee, the Government also chose to remove important clauses from the Bill that were in place to keep adults safe online. Without the all-important risk assessments for adults, I must press the Minister on an important point: exactly how will this Bill do anything to keep adults safe online? The Government know all that, but have still pursued a course of action that will see the Bill watered down entirely.
Does my hon. Friend agree that, as we discussed in the Bill Committee, there is clear evidence that legal but harmful content is often the gateway to far more dangerous radicalisation and extremism, be it far-right, Islamist, incel or other? Will she therefore join me in supporting amendment 43 to ensure that by default such content is hidden from all adult users?
I completely support my hon. Friend’s comments and I was pleased to see her champion that cause in the Bill Committee. Of course I support amendment 43, tabled in the names of SNP colleagues, to ensure that the toggle is on by default. Abhorrent material is being shared and amplified—that is the key point, amplified—online by algorithms and by the processes and systems in place. It is obvious that the Government just do not get that. That said, there is a majority in Parliament and in the country for strengthening the Online Safety Bill, and Labour has been on the front foot in arguing for a stronger Bill since First Reading last year.
It is also important to recognise the sheer number of amendments and changes we have seen to the Bill so far. Even today, there are many more amendments tabled by the Government. If that does not give an indication of the mess they have made of getting this legislation over the line in a fit and proper state, I do not know what does.
I have said it before, and I am certain I will say it again, but we need to move forward with this Bill, not backward. That is why, despite significant Government delay, we will support the Bill’s Third Reading, as each day of inaction allows more harm to spread online. With that in mind, I too will make some progress.
I will first address new clause 1, tabled in my name and that of my hon. Friend the Member for Manchester Central (Lucy Powell). This important addition to the Bill would go some way towards addressing the gaps in support for individual complaints. We in the Opposition have repeatedly pressed Ministers and the Secretary of State on the mechanisms available to individuals who have complaints or appeals. That is why new clause 1 is so important. It is vital that platforms’ complaints procedures are fit for purpose, and this new clause would finally see the Secretary of State publish a report on the options available to individuals.
We already know that the Bill in its current form fails to consider an appropriate avenue for individual complaints. This is a classic case of David and Goliath, and it is about time those platforms went further in giving their users a transparent, effective complaints process. That substantial lack of transparency underpins so many of the issues Labour has with the way the Government have handled—or should I say mishandled—the Bill so far, and it makes the process by which the Government proceeded to remove the all-important clauses on legal but harmful content, in a quiet room on Committee Corridor just before Christmas, even more frustrating.
That move put the entire Bill at risk. Important sections that would have put protections in place to prevent content such as health and foreign-state disinformation, the promotion of self-harm, and online abuse and harassment from being actively pushed and promoted were rapidly removed by the Government. That is not good enough, and it is why Labour has tabled a series of amendments, including new clauses 4, 5, 6 and 7, that we think would go some way towards correcting the Government’s extremely damaging approach.
Under the terms of the Bill as currently drafted, platforms could set whatever terms and conditions they want and change them at will. We saw that in Elon Musk’s takeover at Twitter, when he lifted the ban on covid disinformation overnight because of his own personal views. Our intention in tabling new clause 4 is to ensure that platforms are not able to simply avoid safety duties by changing their terms and conditions whenever they see fit. This group of amendments would give Ofcom the power to set minimum standards for platforms’ terms and conditions, and to direct platforms to change them if they do not meet those standards.
My hon. Friend is making an important point. She might not be aware of it, but I recently raised in the House the case of my constituents, whose 11-year-old daughter was groomed on the music streaming platform Spotify and was able to upload explicit photographs of herself on that platform. Thankfully, her parents found out and made several complaints to Spotify, which did not immediately remove that content. Is that not why we need the ombudsman?
I am aware of that case, which is truly appalling and shocking. That is exactly why we need such protections in the Bill: to stop those cases proliferating online, to stop the platforms from choosing their own terms of service, and to give Ofcom real teeth, as a regulator, to take on those challenges.
Does the hon. Lady accept that the Bill does give Ofcom the power to set minimum safety standards based on the priority legal offences written into the Bill? That would cover almost all the worst kinds of offences, including child sexual exploitation, inciting violence and racial hatred, and so on. Those are the minimum safety standards that are set, and the Bill guarantees them.
What is not in those minimum safety standards is all the horrendous and harmful content that I have described: covid disinformation, harmful content from state actors, self-harm promotion, antisemitism, misogyny and the incel culture, all of which is proliferating online and being amplified by the algorithms. This set of minimum safety standards can be changed overnight.
As the hon. Lady knows, foreign-state disinformation is covered because it is part of the priority offences listed in the National Security Bill, so those accounts can be disabled. Everything that meets the criminal threshold is in this Bill because it is in the National Security Bill, as she knows. The criminal thresholds for all the offences she lists are set in schedule 7 of this Bill.
That is just the problem, though, isn’t it? A lot of those issues would not be covered by the minimum standards—that is why we have tabled new clause 4—because they do not currently meet the legal threshold. That is the problem. There is a grey area of incredibly harmful but legal content, which is proliferating online, being amplified by algorithms and by influencers—for want of a better word—and being fed to everybody online. That content is then shared incredibly widely, and that is what is causing harm and disinformation.
No, I will not. I need to make progress; we have a lot to cover and a lot of amendments, as I have outlined.
Under the terms of the Bill, platforms can issue whatever minimum standards they wish and then simply change them at will overnight. In tabling new clause 4, our intention is to ensure that the platforms are not able to avoid safety duties by changing their terms and conditions. As I have said, this group of amendments will give Ofcom the relevant teeth to act and keep everybody safe online.
We all recognise that there will be a learning curve for everyone involved once the legislation is enacted. We want to get that right, and the new clauses will ensure that platforms have specific duties to keep us safe. That is an important point, and I will continue to make it clear at every opportunity, because the platforms and providers have, for far too long, got away with zero regulation—nothing whatsoever—and enough is enough.
During the last Report stage, I made it clear that Labour considers individual liability essential to ensuring that online safety is taken seriously by online platforms. We have been calling for stronger criminal sanctions for months, and although we welcome some movement from the Government on that issue today, enforcement is now ultimately a narrower set of measures because the Government gutted much of the Bill before Christmas. That last-minute U-turn is another one to add to a long list, but, to be frank, very little surprises me when it comes to this Government’s approach to law-making.
I have to say to the hon. Lady that to describe it as a U-turn is not reasonable. The Government have interacted regularly with those who, like her, want to strengthen the Bill. There has been proper engagement and constructive conversation, and the Government have been persuaded by those who have made a similar case to the one she is making now. I think that warrants credit, rather than criticism.
I completely disagree with the right hon. Member, because we voted on this exact amendment before Christmas in the previous Report stage. It was tabled in the name of my right hon. Friend the Member for Barking (Dame Margaret Hodge), and it was turned down. It was word for word exactly the same amendment. If that is not a U-turn, what is it?
I am pleased to support a number of important amendments in the names of the hon. Members for Aberdeen North (Kirsty Blackman) and for Ochil and South Perthshire (John Nicolson). In particular, I draw colleagues’ attention to new clause 3, which would improve the child empowerment duties in the Bill. The Government may think they are talking a good game on child safety, but it is clear to us all that some alarming gaps remain. The new clause would go some way to ensuring that the systems and processes behind platforms will go further in keeping children safe online.
In addition, we are pleased, as I have mentioned, to support amendment 43, which calls for the so-called safety toggle feature to be turned on by default. When the Government removed the clause relating to legal but harmful content in Committee, they instead introduced a requirement for platforms to give users the tools to reduce the likelihood of certain content appearing on their feeds. We have serious concerns about whether this approach is even workable, but if it is the route that the Government wish to take, we feel that these tools should at least be turned on by default.
Since my hon. Friend is on the point of safeguarding children, will she support Baroness Kidron as the Bill progresses to the other House in ensuring that coroners have access to data where they suspect that social media may have played a part in the death of children?
I can confirm that we will be supporting Baroness Kidron in her efforts. We will support a number of amendments that will be tabled in the Lords in the hope of strengthening this Bill further, because we have reached the limit of what we can do in this place. I commend the work that Baroness Kidron and the 5Rights Foundation have been doing to support children and to make this Bill work to keep everybody online as safe as possible.
Supporting amendment 43 would send a strong signal that our Government want to put online safety at the forefront of all our experiences of using the internet. For that reason, I look forward to the Minister giving this amendment serious consideration. Scottish National party colleagues can be assured of our support, as I have previously outlined, should there be a vote on it.
More broadly, I highlight the series of amendments tabled in my name and that of my hon. Friend the Member for Manchester Central that ultimately aim to reverse out of the damaging avenue that the Government have chosen to go down in regulating so-called legal but harmful content. As I have already mentioned, the Government haphazardly chose to remove those important clauses in Committee. They have chopped and changed this Bill more times than any of us can remember, and we are now left with a piece of legislation that is even more difficult to follow and, importantly, to implement than when it was first introduced. We can all recognise that there is a huge amount of work to be done to make the Bill fit for purpose. Labour has repeatedly worked to make meaningful improvements at every opportunity, and it will be on the Government’s hands if the Bill is subject to even more delay. The Minister knows that, and I sincerely hope that he will take these concerns seriously. After all, if he will not listen to me, he would do well to listen to the mounting concerns raised by Members on his own Benches instead.
I do not have time, but I thank all Members who contributed to today’s debate. I pay tribute to my officials and to all the Ministers who have worked on this Bill over such a long time.
I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.