Online Safety Bill Debate
Commons Chamber
My right hon. Friend is absolutely right. I am keen to bring this back into scope before Mr Speaker chastises us any further, but she is right to say that this will have a direct real-world impact. This is what happens when we focus on content rather than directly on the platforms and the algorithms on the platforms proliferating this content. That is where the focus needs to be. It is the algorithms that share and amplify this content to these many followers time and again that need to be tackled, rather than the content itself. That is what we have been pleading with the Government to concentrate on, but here we are in this mess.
We are pleased that the Government have taken on board Labour’s policy to criminalise certain behaviours—including the encouragement of self-harm, sharing people’s intimate images without their consent, and controlling or coercive behaviours—but we believe that the communications offences more widely should remain in order to tackle dangerous online harms at their root. We have worked consistently to get this Bill over the line and we have reached out to do so. It has been subject to far too many delays and it is on the Government’s hands that we are again facing substantial delays, when internet regulation has never been more sorely needed. I know that the Minister knows that, and I sincerely hope he will take our concerns seriously. I reach out to him again across the Dispatch Box, and look forward to working with him and challenging him further where required as the Bill progresses. I look forward to getting the Bill on to the statute book.
I welcome the Under-Secretary of State for Digital, Culture, Media and Sport, my hon. Friend the Member for Sutton and Cheam (Paul Scully), to his place. To say that he has been given a hospital pass in terms of this legislation is a slight understatement. It is very difficult to understand, and the ability he has shown at the Dispatch Box in grasping many of the major issues is to his credit. He really is a safe pair of hands and I thank him for that.
Looking at the list of amendments, I think it is a bit of a hotchpotch, yet we are going to deal only with certain amendments today and others are not in scope. That shows exactly where we are with this legislation. We have been in this stasis now for five years. I remember that we were dealing with the issue when I joined the Digital, Culture, Media and Sport Committee, and it is almost three years since the general election when we said we would bring forward this world-leading legislation. We have to admit that is a failure of the political class in all respects, but we have to understand the problem and the realities facing my hon. Friend, other Ministers and the people from different Departments involved in drafting this legislation.
We are dealing with companies that are more powerful than the oil barons and railway barons of the 19th century. These companies are more important than many states. The total value of Alphabet, for instance, is more than the total GDP of the Netherlands, and that is probably a low estimate of Alphabet’s global reach and power. These companies are, in many respects, almost new nation states in their power and reach, and they have been brought about by individuals having an idea in their garage. They still have that culture of having power without the consequences that flow from it.
I am grateful to have the opportunity to speak in this debate. I commend the right hon. Member for Basingstoke (Dame Maria Miller) on her work in this important area. I would like to focus my remarks on legal but harmful content and its relationship to knife crime, and to mention a very harrowing and difficult constituency case of mine. As we have heard, legal but harmful content can have a truly dreadful effect. I pay tribute to the families of the children who have been lost, who have attended the debate, a number of whom are still in the Public Gallery.
Just to be clear, the hon. Gentleman’s speech must relate to the amendments before us today.
Thank you, Madam Deputy Speaker. A boy called Olly Stephens in my constituency was just 13 years old when he was stabbed and brutally murdered in an attack linked to online bullying. He died, sadly, very near his home. His parents had little idea of the social media activity in his life. It is impossible to imagine what they have been through. Our hearts go out to them.
Legal but harmful content played a terrible part in the attack on Olly. The two boys who attacked and stabbed him had been sharing enormous numbers of pictures and videos of knives over a long period of time. There were often videos of teenagers playing with knives, waving them or holding them. They circulated them on 11 different social media platforms, and none of those platforms took any action to take the content down. We all need to learn more about such cases to fully understand the impact of legal but harmful content. Even at this late stage, I hope that the Government will think again about the changes they have made and include this area in the Bill once more.
There is a second aspect of this very difficult case that I want to mention: the fact that Olly’s murder was discussed on social media and was planned to some extent beforehand. The wider issues here underline the need for far greater regulation and moderation of social media, in particular teenagers’ use of these powerful sites. I am finding it difficult to talk about some of these matters, but I hope that the Government will take my points on board and address the issue of legal but harmful content, and that the Minister will think again about these important matters. Perhaps we will have an opportunity to discuss it in the Bill’s later stages.
Order. Just a quick reminder: I know it is extremely difficult, and I do not want to interrupt hon. Members when they are making their speeches, but it is important that we try to address the amendments that are before us today. There will be a separate debate on whether to recommit the Bill and on the other ideas, so they can be addressed at that point. As I say, it is important to relate remarks to the amendments that are before us.
I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.
I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.
On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.
I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms that I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.
Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.
We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.
Let me turn to media literacy—a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that they are there and know exactly how to use them, and unless they are being used, their existence is pointless.
It is a pleasure to speak in the debate. I thank Members who have spoken thus far for their comments. I commend the right hon. Member for Chelmsford (Vicky Ford) for what she referred to in relation to eating disorders. At this time, we are very aware of that pertinent issue: the impact that social media has—the social pressure and the peer pressure—on those who feel they are too fat when they are not, or that they are carrying weight when they are not. That is part of what the Bill tries to address. I thank the Minister for his very constructive comments—he is always constructive—and for laying out where we are. Some of us perhaps have concerns that the Bill does not go far enough. I know I am one of them and maybe Minister, you might be of the same mind yourself—
The Minister might be of the same mind himself.
Through speaking in these debates, my office has seen an increase in correspondence from parents who are thankful that these difficult issues are being talked about. The world is changing and progressing, and if we are going to live in a world where we want to protect our children and our grandchildren—I have six grandchildren—and all other grandchildren who are involved in social media, the least we can do is make sure they are safe.
I commend the hon. Member for Batley and Spen (Kim Leadbeater) and others, including the hon. Member for Watford (Dean Russell), who have spoken about Zach’s law. We are all greatly impressed that we have that in the Bill through constructive lobbying. New clause 28, which the hon. Member for Rotherham (Sarah Champion) referred to, relates to advocacy for young people. That is an interesting idea, but I feel that advocacy should be for the parents first and not necessarily young people.
Ahead of the debate, I was in contact with the Royal College of Psychiatrists. It published a report entitled “Technology use and the mental health of children and young people”—new clause 16 is related to that—which was an overview of research into the use of screen time and social media by children and young teenagers. It has been concluded that excessive use of phones and social media by a young person is detrimental to their development and mental health—as we all know and as Members have spoken about—and furthermore that online abuse and bullying has become more prevalent because of that. The right hon. Member for Witham (Priti Patel) referred to those who are susceptible to online harm. We meet them every day, and parents tell me that our concerns are real.
A recent report by NHS Digital found that one in eight 11 to 16-year-olds reported that they had been bullied online. When parents contact me, they say that bullying online is a key issue for them, and the statistics come from those who choose to be honest and talk about it. Although the Government’s role is to create a Bill that enables protection for our children, there is also an incredible role for schools, which can address bullying. My hon. Friend the Member for Upper Bann (Carla Lockhart) and I talked about some of the young people we know at school who have been bullied online. Schools have stepped in and stopped that, encouraging and protecting children, and they can play that role as well.
We have all read the story of Molly Russell, who was only 14 years old when she took her life. Nobody in this House or outside it could fail to have been moved by her story. Her father stated that he strongly believed that the images, videos and information that she was able to access through Instagram played a crucial part in her life being cut short. The Bill must complete its passage and focus on strengthening protections online for children. Ultimately, the responsibility is on large social media companies to ensure that harmful information is removed, but the Bill puts the onus on us to hold social media firms to account and to ensure that they do so.
Harmful and dangerous content for children comes in many forms—namely, online abuse and exposure to self-harm and suicidal images. In addition, any inappropriate or sexual content has the potential to put children and young people at severe risk. The Bill is set to put provisions in place to protect victims in the sharing of nude or intimate photos. That is increasingly important for young people, who are potentially being groomed online and do not understand the full extent of what they are doing and the risks that come with that. Amendments have been tabled to ensure that, should such cases of photo sharing go to court, provisions are in place to ensure complete anonymity for the victims—for example, through video links in court, and so on.
I commend the right hon. Member for Basingstoke (Dame Maria Miller), who is not in her place, for her hard work in bringing forward new clause 48. Northern Ireland, along with England and Wales, will benefit from new clause 53, and I welcome the ability to hand down sentences of between six months and potentially five years.
Almost a quarter of girls who have taken a naked image have had their image sent to someone else online without their permission. Girls face very distinct and increased risks on social media, with more than four in five online grooming crimes targeting girls, and 97% of child abuse material featuring the sexual abuse of girls—wow, we really need to do something to protect our children and to give parents hope. There needs to be increased emphasis and focus on making children’s use of the internet safer by design. Once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of sexual abuse, which often stem from photo sharing.
The Minister referred to terrorism and how terrorism can be promoted online. I intervened on him to mention the glorification of IRA terrorism and how it encourages further acts of terrorism among people who are susceptible to becoming involved. I am quite encouraged by the Minister’s response, and I think that we need to take a significant step. Some in Northern Ireland, for instance, try to rewrite history and use the glorification of terrorism for that purpose. We would like to see a strengthening of measures to ensure that those involved in such acts across Northern Ireland are controlled.
In conclusion, there are many aspects of the Bill that I can speak in support of in relation to the benefits of securing digital protections for those on social media. This is, of course, about protecting not just children, but all of us from the dangers of social media. I have chosen to speak on these issues as they are often raised by constituents. There are serious matters regarding the glorification and encouragement of self-harm that the Bill needs to address. We have heard stories tonight that are difficult to listen to, because they are true stories from people we know, and we have heard horror stories about intimate photo sharing online. I hope that action on those issues, along with the many others that the Government are addressing, will be embedded in the Bill with the intent to finally ensure that we have regulations and protection for all people, especially our children—I think of my children and grandchildren, and like everybody else, my constituents.
Online Safety Bill Debate
Commons Chamber
Before we open the debate, I want to make a brief comment about the scope of today’s debate. Today’s debate on consideration follows the re-committal of the Bill to a Public Bill Committee in December last year. We are therefore debating today only the new clauses and amendments listed on the selection paper issued today. These are either: new clauses relating to the re-committed clauses and schedules; amendments to those clauses and schedules; or amendments to other parts of the Bill consequential on changes made to the Bill on re-committal in the Public Bill Committee.
On 5 December, the House finished its consideration on report of other parts of the Bill. The scope of today’s report stage generally does not include those parts of the Bill that were not re-committed. The exception is where amendments on the selection paper are consequential to the changes made to re-committed clauses, and relate to clauses that were not re-committed. Should there be time for debate on Third Reading, it is of course permissible to speak then to any of the content of the Bill.
I should also remind the House that, because of the time taken for the emergency debate, proceedings on consideration are now scheduled to finish at 8.13 pm and proceedings on Third Reading at 9.13 pm.
New Clause 1
Report on redress for individual complaints
‘(1) The Secretary of State must publish a report assessing options for dealing with appeals about complaints made under section 17 of this Act.
(2) The report must—
(a) provide a general update on the fulfilment of duties about complaints procedures which apply in relation to all regulated user-to-user services;
(b) assess which body should be responsible for a system to deal with appeals in cases where a complainant considers that a complaint has not been satisfactorily dealt with; and
(c) provide options for how the system should be funded, including consideration of whether an annual surcharge could be imposed on user-to-user services.
(3) The report must be laid before Parliament within six months of the commencement of section 17.’—(Alex Davies-Jones.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 2—Offence of failing to comply with a relevant duty—
‘(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.
(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—
(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or
(b) was a person purporting to act in such a capacity.
(3) A person who commits an offence under this section is liable on conviction on indictment to imprisonment for a term not exceeding two years or a fine (or both).
(4) In this section, “relevant duty” means a duty provided for by section 11 of this Act.’
This new clause makes it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in clause 11. Where the offence is committed with the consent or connivance of a senior manager or other officer of the provider, or is attributable to their neglect, the officer, as well as the entity, is guilty of the offence.
New clause 3—Child user empowerment duties—
‘(1) This section sets out the duties to empower child users which apply in relation to Category 1 services.
(2) A duty to include in a service, to the extent that it is proportionate to do so, features which child users may use or apply if they wish to increase their control over harmful content.
(3) The features referred to in subsection (2) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) reduce the likelihood of the user encountering priority content that is harmful, or particular kinds of such content, by means of the service, or
(b) alert the user to the harmful nature of priority content that is harmful that the user may encounter by means of the service.
(4) A duty to ensure that all features included in a service in compliance with the duty set out in subsection (2) are made available to all child users.
(5) A duty to include clear and accessible provisions in the terms of service specifying which features are offered in compliance with the duty set out in subsection (2), and how users may take advantage of them.
(6) A duty to include in a service features which child users may use or apply if they wish to filter out non-verified users.
(7) The features referred to in subsection (6) are those which, if used or applied by a user, result in the use by the service of systems or processes designed to—
(a) prevent non-verified users from interacting with content which that user generates, uploads or shares on the service, and
(b) reduce the likelihood of that user encountering content which non-verified users generate, upload or share on the service.
(8) A duty to include in a service features which child users may use or apply if they wish to only encounter content by users they have approved.
(9) A duty to include in a service features which child users may use or apply if they wish to filter out private messages from—
(a) non-verified users, or
(b) adult users, or
(c) any user other than those on a list approved by the child user.
(10) In determining what is proportionate for the purposes of subsection (2), the following factors, in particular, are relevant—
(a) all the findings of the most recent child risk assessment (including as to levels of risk and as to nature, and severity, of potential harm), and
(b) the size and capacity of the provider of a service.
(11) In this section “non-verified user” means a user who has not verified their identity to the provider of a service (see section 57(1)).
(12) In this section references to features include references to functionalities and settings.’
New clause 4—Safety duties protecting adults and society: minimum standards for terms of service—
‘(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).
(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.
(3) OFCOM must, at least once a year, conduct a review of—
(a) the extent to which providers are meeting the minimum standards, and
(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.
(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.
(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.
(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.’
New clause 5—Harm to adults and society risk assessment duties—
‘(1) This section sets out the duties about risk assessments which apply in relation to Category 1 services (in addition to the duties about risk assessments set out in section 8 and, in the case of Category 1 services likely to be accessed by children, section 10).
(2) A duty to carry out a suitable and sufficient harm to adults and society risk assessment at a time set out in, or as provided by, Schedule 3.
(3) A duty to take appropriate steps to keep a harm to adults and society risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient harm to adults and society risk assessment relating to the impacts of that proposed change.
(5) A “harm to adults and society risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults and society (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of harm to adults and society presented by different kinds of priority content that is harmful to adults and society;
(d) the level of risk of harm to adults and society presented by priority content that is harmful to adults and society which particularly affects individuals with a certain characteristic or members of a certain group;
(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults and society, identifying and assessing those functionalities that present higher levels of risk;
(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults and society;
(g) the nature, and severity, of the harm that might be suffered by adults and society from the matters identified in accordance with paragraphs (b) to (f);
(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.
(6) In this section references to risk profiles are to the risk profiles for the time being published under section 85 which relate to the risk of harm to adults and society presented by priority content that is harmful to adults and society.
(7) See also—
(a) section 19(2) (records of risk assessments), and
(b) Schedule 3 (timing of providers’ assessments).’
New clause 6—Safety duties protecting adults and society—
‘(1) This section sets out the duties to prevent harms to adults and society which apply in relation to Category 1 services.
(2) A duty to summarise in the terms of service the findings of the most recent adults and society risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults and society).
(3) If a provider decides to treat a kind of priority content that is harmful to adults and society in a way described in subsection (4), a duty to include provisions in the terms of service specifying how that kind of content is to be treated (separately covering each kind of priority content that is harmful to adults and society which a provider decides to treat in one of those ways).
(4) These are the kinds of treatment of content referred to in subsection (3)—
(a) taking down the content;
(b) restricting users’ access to the content;
(c) limiting the recommendation or promotion of the content;
(d) recommending or promoting the content;
(e) allowing the content without treating it in a way described in any of paragraphs (a) to (d).
(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults and society (as identified in the most recent adults and society risk assessment of the service), by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently.
(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults and society present on the service, a duty to notify OFCOM of—
(a) the kinds of such content identified, and
(b) the incidence of those kinds of content on the service.
(8) In this section—
“harm to adults and society risk assessment” has the meaning given by section [harm to adults and society risk assessment duties];
“non-designated content that is harmful to adults and society” means content that is harmful to adults and society other than priority content that is harmful to adults and society.
(9) See also, in relation to duties set out in this section, section 18 (duties about freedom of expression and privacy).’
New clause 7—“Content that is harmful to adults and society” etc—
‘(1) This section applies for the purposes of this Part.
(2) “Priority content that is harmful to adults and society” means content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults and society.
(3) “Content that is harmful to adults and society” means—
(a) priority content that is harmful to adults and society, or
(b) content, not within paragraph (a), of a kind which presents a material risk of significant harm to an appreciable number of adults in the United Kingdom.
(4) For the purposes of this section—
(a) illegal content (see section 53) is not to be regarded as within subsection (3)(b), and
(b) content is not to be regarded as within subsection (3)(b) if the risk of harm flows from—
(i) the content’s potential financial impact,
(ii) the safety or quality of goods featured in the content, or
(iii) the way in which a service featured in the content may be performed (for example, in the case of the performance of a service by a person not qualified to perform it).
(5) References to “priority content that is harmful to adults and society” and “content that is harmful to adults and society” are to be read as—
(a) limited to content within the definition in question that is regulated user-generated content in relation to a regulated user-to-user service, and
(b) including material which, if it were present on a regulated user-to-user service, would be content within paragraph (a) (and this section is to be read with such modifications as may be necessary for the purpose of this paragraph).
(6) Sections 55 and 56 contain further provision about regulations made under this section.’
Government amendments 1 to 4.
Amendment 44, clause 11, page 10, line 17, at end insert ‘, and—
“(c) mitigate the harm to children caused by habit-forming features of the service by consideration and analysis of how processes (including algorithmic serving of content, the display of other users’ approval of posts and notifications) contribute to development of habit-forming behaviour.”’
Amendment 82, page 10, line 25, at end insert—
‘(3A) Content under subsection (3) includes content that may result in serious harm or death to a child while crossing the English Channel with the aim of entering the United Kingdom in a vessel unsuited or unsafe for those purposes.’
This amendment would require proportionate systems and processes, including removal of content, to be in place to control the access by young people to material which encourages them to undertake dangerous Channel crossings where their lives could be lost.
Amendment 83, page 10, line 25, at end insert—
‘(3A) Content promoting self-harm, including content promoting eating disorders, must be considered as harmful.’
Amendment 84, page 10, line 25, at end insert—
‘(3A) Content which advertises or promotes the practice of so-called conversion practices of LGBTQ+ individuals must be considered as harmful for the purposes of this section.’
Amendment 45, page 10, line 36, leave out paragraph (d) and insert—
‘(d) policies on user access to the service, parts of the service, or to particular content present on the service, including blocking users from accessing the service, parts of the service, or particular content,’.
Amendment 47, page 10, line 43, at end insert ‘, and
“(i) reducing or removing a user’s access to livestreaming features.”’
Amendment 46, page 10, line 43, at end insert ‘, and
“(i) reducing or removing a user’s access to private messaging features.”’
Amendment 48, page 11, line 25, after ‘accessible’ insert ‘for child users.’
Amendment 43, clause 12, page 12, line 24, leave out ‘made available to’ and insert
‘in operation by default for’.
Amendment 52, page 12, line 30, after ‘non-verified users’ insert
‘and to enable them to see whether another user is verified or non-verified.’
This amendment would require Category 1 services to make visible to users whether another user is verified or non-verified.
Amendment 49, page 12, line 30, at end insert—
‘(6A) A duty to ensure features and provisions in subsections (2), (4) and (6) are accessible and understandable to adult users with learning disabilities.’
Amendment 53, page 12, line 32, after ‘to’ insert ‘effectively’.
This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.
Amendment 55, page 18, line 15, at end insert—
‘(4A) Content that is harmful to adults and society.’
Amendment 56, clause 17, page 20, line 10, leave out subsection (6) and insert—
‘(6) The following kinds of complaint are relevant for Category 1 services—
(a) complaints by users and affected persons about content present on a service which they consider to be content that is harmful to adults and society;
(b) complaints by users and affected persons if they consider that the provider is not complying with a duty set out in—
(i) section [adults and society online safety]
(ii) section 12 (user empowerment),
(iii) section 13 (content of democratic importance),
(iv) section 14 (news publisher content),
(v) section 15 (journalistic content), or
(vi) section 18(4), (6) or (7) (freedom of expression and privacy);
(c) complaints by a user who has generated, uploaded or shared content on a service if that content is taken down, or access to it is restricted, on the basis that it is content that is harmful to adults and society;
(d) complaints by a user of a service if the provider has given a warning to the user, suspended or banned the user from using the service, or in any other way restricted the user’s ability to use the service, as a result of content generated, uploaded or shared by the user which the provider considers to be content that is harmful to adults and society.’
Amendment 57, clause 19, page 21, line 40, leave out ‘or 10’ and insert
‘, 10 or [harms to adults and society risk assessment duties]’.
Amendment 58, page 22, line 37, at end insert—
‘(ba) section [adults and society online safety] (adults and society online safety),’
Government amendment 5.
Amendment 59, clause 44, page 44, line 11, at end insert
‘or
(ba) section [adults and society online safety] (adults and society online safety);’
Government amendment 6.
Amendment 60, clause 55, page 53, line 43, at end insert—
‘(2A) The Secretary of State may specify a description of content in regulations under section [“Content that is harmful to adults and society” etc](2) (priority content that is harmful to adults and society) only if the Secretary of State considers that, in relation to regulated user-to-user services, there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content.’
Amendment 61, page 53, line 45, after ‘54’ insert
‘or [“Content that is harmful to adults and society” etc]’.
Amendment 62, page 54, line 8, after ‘54’ insert
‘or [“Content that is harmful to adults and society” etc]’.
Amendment 63, page 54, line 9, leave out ‘are to children’ and insert
‘or adults are to children or adults and society’.
Government amendments 7 to 16.
Amendment 77, clause 94, page 85, line 42, after ‘10’ insert
‘, [Adults and society risk assessment duties]’.
Amendment 78, page 85, line 44, at end insert—
‘(iiia) section [Adults and society online safety] (adults and society online safety);’
Amendment 54, clause 119, page 102, line 22, at end insert—
‘Section [Safety duties protecting adults and society: minimum standards for terms of service] Minimum standards for terms of service’
Amendment 79, page 102, line 22, at end insert—
‘Section [Harm to adults and society assessments] Harm to adults and society risk assessments
Section [Adults and society online safety] Adults and society online safety’
Government amendments 17 to 19.
Amendment 51, clause 207, page 170, line 42, after ‘including’ insert ‘but not limited to’.
Government amendments 20 to 23.
Amendment 81, clause 211, page 177, line 3, leave out ‘and 55’ and insert
‘, [“Content that is harmful to adults and society” etc] and 55’.
Government amendments 24 to 42.
Amendment 64, schedule 8, page 207, line 13, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 65, page 207, line 15, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 66, page 207, line 17, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 67, page 207, line 21, leave out ‘relevant content’ and insert
‘content that is harmful to adults and society, or other content which they consider breaches the terms of service.’
Amendment 68, page 207, line 23, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 69, page 207, line 26, leave out ‘relevant content’ and insert
‘priority content that is harmful to adults and society’.
Amendment 70, page 208, line 2, leave out
‘or content that is harmful to children’
and insert
‘content that is harmful to children or priority content that is harmful to adults and society’.
Amendment 71, page 208, line 10, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 72, page 208, line 13, leave out
“and content that is harmful to children”
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 73, page 210, line 2, at end insert
‘“content that is harmful to adults and society” and “priority content that is harmful to adults and society” have the same meaning as in section [“Content that is harmful to adults and society” etc]’.
Amendment 50, schedule 11, page 217, line 31, at end insert—
‘(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.’
Amendment 74, page 218, line 24, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 75, page 219, line 6, leave out
‘and content that is harmful to children’
and insert
‘content that is harmful to children and priority content that is harmful to adults and society’.
Amendment 76, page 221, line 24, at end insert—
‘“priority content that is harmful to adults and society” has the same meaning as in section [“Content that is harmful to adults and society” etc]’.
Amendment 80, page 240, line 35, in schedule 17, at end insert—
‘(ba) section [Harm to adults and society assessments] (Harm to adults and society assessments), and’.
I have noticed that some people are standing who may not have applied earlier. If anybody is aware of that, can they let me know, and I can adjust timings accordingly? At the moment, my estimate is that if everybody takes no longer than seven minutes, and perhaps more like six, we can get everybody in comfortably without having to impose a time limit.
I remind hon. Members about the six-minute advisory time limit.
It is a great relief to see the Online Safety Bill finally reach this stage. It seems like a long time since my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) kicked it off with the ambitious aim of making the UK the safest place in the world to be online. Although other countries around the world had picked at the edges of it, we were truly the first country in the world to set out comprehensive online safety legislation. Since then, other jurisdictions have started and, in some cases, concluded this work. As one of the relay of Ministers who have carried this particular baton of legislation on its very long journey, I know we are tantalisingly close to getting to the finish line. That is why we need to focus on that today, and I am really grateful to the hon. Member for Pontypridd (Alex Davies-Jones) for confirming that the Opposition are going to support the Bill on Third Reading.
My hon. Friend is absolutely right to raise this, because we do need the Bill to be future-proofed to deal with some of the recently emerging threats to women and others that the online world has offered.
The potential threat of online harms is everyday life for most children in the modern world. Before Christmas, I received an email from my son’s school highlighting a TikTok challenge encouraging children to strangle each other until they passed out. This challenge probably did not start on TikTok, and it certainly is not exclusive to the platform, but when my children were born I never envisaged a day when I would have to sit them down and warn them about the potential dangers of allowing someone else to throttle them until they passed out. It is terrifying. Our children need this legislation.
I welcome the Government support for amendment 84 to clause 11, in the name of my hon. Friend the Member for Rutland and Melton (Alicia Kearns), to ban content that advertises so-called conversion therapies for LGBTQ+ people. Someone’s sexuality and who they love is not something to be cured, and unscrupulous crooks should not be able to profit from pushing young people towards potentially sinister and harmful treatments.
I really sympathise with the aims behind new clause 2, on senior executive liability. It is vital that this regime has the teeth to protect children and hold companies to account. I know the 10% of annual global turnover maximum fine is higher than some of the global comparisons, and certainly having clear personal consequences for those responsible for enforcing the law is an incentive for them to do it properly, but there is clearly a balance to strike. We must make sure that sanctions are proportionate and targeted, and do not make the UK a less attractive place to build a digital business. I am really pleased to hear Ministers’ commitment to a final amendment that will strike that really important balance.
I am concerned about the removal of measures on legal but harmful content. I understand the complexity of defining them, but other measures, including the so-called triple shield, do not offer the same protections for vulnerable adults or avoid the cliff edge when someone reaches the age of 18. That particularly concerns me for adults with special educational needs or disabilities. The key point here is that, if the tragic cases of Molly Russell and dozens of young people like her teach us anything, it is that dreadful, harmful online content cannot be defined strictly by what is illegal, because algorithms do not differentiate between harmful and harmless content. They see a pattern and they exploit it.
We often talk about the parallels between the online and offline world—we say that what is illegal online should be illegal offline, and vice versa—but in reality the two worlds are fundamentally different. In the real world, for a young person struggling with an eating disorder or at risk of radicalisation, their inner demons are not reinforced by everyone they meet on the street, but algorithms are echo chambers. They take our fears and our paranoia, and they surround us with unhealthy voices that normalise and validate them, however dangerous and however hateful, glamorising eating disorders, accelerating extremist, racist and antisemitic views and encouraging violent misogyny on incel sites.
That is why I worry that the opt-out option suggested in the Bill simply does not offer enough protection: the lines between what is legal and illegal are too opaque. Sadly, it feels as though this part of the Bill has become the lightning rod for those who think it will result in an overly censorious approach. However, we are where we are. As the Molly Rose Foundation said, the swift implementation of the Bill must now be the priority. Time is no longer on our side, and while we perfect this vast, complicated and inherently imperfect legislation, the most unspeakable content is allowed to proliferate in the online world every single day.
Finally, I put on record the exhaustive efforts made by the incredible team at the Department for Digital, Culture, Media and Sport and the Home Office, who brought this Bill to fruition. If there was ever an example of not letting the perfect be the enemy of the good, this is it, and right now we need to get this done. The stakes in human terms simply could not be any higher.
I call the SNP spokesperson, Kirsty Blackman.
I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on what was one of the best speeches on this Bill—and we have heard quite a lot. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) covered nicely the amendment on liability, and brought up the issue of hate, particularly when pointed towards the Jewish community. I thank her for consistently bringing that up. It is important to hear her voice and others on this issue.
Amendment 43 was tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson) and it regards a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default option. We should start from a point of view that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.
Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.
If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.
Amendment 44 is on habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend time on them. They make money by encouraging people to spend as much time on them as possible—that is the intention behind them. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider that important issue.
New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.
Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.
We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.
I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that there are moves by the UK Government to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that.
Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is about the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key thing, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, set up entirely to dox trans people and put their lives at risk—as category 1 platforms and require them to meet all the rules, risk assessments and things for those platforms.
We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.
I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.
Order. Things are not going quite according to plan, so colleagues might perhaps like to gear more towards five minutes as we move forward.
I rise to speak in favour of new clause 4, on minimum standards. In particular, I shall restrict my remarks to minimum standards in respect of incel culture.
Colleagues will know of the tragedy that took place in Plymouth in 2021. Indeed, the former Home Secretary, the right hon. Member for Witham (Priti Patel), visited Plymouth to meet and have discussions with the people involved. I really want to rid the internet of the disgusting, festering incel culture that is capturing so many of our young people, especially young men. In particular, I want minimum standards to apply and to make sure that, on big and small platforms where there is a risk, those minimum standards include the recognition of incel content. At the moment, incel content is festering in the darkest corners of the internet, where young men are taught to channel their frustrations into an insidious hatred of women and to think of themselves as brothers in arms in a war against women. It is that serious.
In Parliament this morning I convened a group of expert stakeholders, including those from the Centre for Countering Digital Hate, Tech Against Terrorism, Moonshot, Girlguiding, the Antisemitism Policy Trust and the Internet Watch Foundation, to discuss the dangers of incel culture. I believe that incel culture is a growing threat online, with real-world consequences. Incels are targeting young men, young people and children to swell their numbers. Andrew Tate may not necessarily be an incel, but his type of hate and division is growing and is very popular online. He is not the only one, and the model of social media distribution that my right hon. Friend the Member for Barking (Dame Margaret Hodge) spoke about incentivises hate to be viewed, shared and indulged in.
This Bill does not remove incel content online and therefore does not prevent future tragedies. As chair of the all-party parliamentary group on social media, I want to see minimum standards to raise the internet out of the sewer. Where is the compulsion for online giants such as Facebook and YouTube to remove incel content? Five of the most popular incel channels on YouTube have racked up 140,000 subscribers and 24 million views between them, and YouTube is still platforming four of those five. Why? How can these channels apparently pass YouTube’s current terms and conditions? The content is truly harrowing. In these YouTube videos, men who have murdered women are described as saints and lauded in incel culture.
We know that incels use mainstream platforms such as YouTube to reel in unsuspecting young men—so-called normies—before linking them to their own small, specialist websites that show incel content. This is called breadcrumbing: driving traffic and audiences from mainstream platforms to smaller platforms—which will be outside the scope of category 1 provisions and therefore any minimum standards—where individuals start their journey to incel radicalisation.
I think we need to talk less about freedom of speech and more about freedom of reach. We need to talk about enabling fewer and fewer people to see that content, and about down-ranking sites with appalling content like this to increase friction and reduce audience reach. Incel content not only includes sexist and misogynist material; it also frequently includes anti-Semitic, racist, homophobic and transphobic items layered on top of one another. However, without a “legal but harmful” provision, the Bill does nothing to force search engines to down-rank harmful content. If it is to be online, it needs to be harder and harder to find.
I do not believe that a toggle will be enough to deal with this. I agree with amendment 43—if we are to have a toggle, the default should be the norm—but I do not think a toggle will work, because it will be possible to evade it with a simple Google Chrome extension that auto-toggles and therefore makes it almost redundant immediately. It will be a minor inconvenience, not a game changer. Some young men spend 10 hours a day looking at violent incel content online. Do we really think that a simple button, a General Data Protection Regulation-style annoyance button, will stop them from doing so? It will not, and it will not prevent future tragedies.
However, this is not just about the effect on other people; it is also about the increase in the number of suicides. One of the four largest incel forums is dedicated to suicide and self-harm. Suicide is normalised in the forum, and is often referred to as “catching the bus.” People get together to share practical advice on how they can take their own lives. That is not content to which we should be exposing our young people, but it is currently legal. It is harmful, but it will remain legal under the Bill because the terms and conditions of those sites are written by incels to promote incel content. Even if the sites were moved from category 2 to category 1, they would still pass the tests in the Bill, because the incels have written the terms and conditions to allow that content.
Why are smaller platforms not included in the Bill? Ofcom should have the power to bring category 2 sites into scope on the basis of risk. Analysis conducted by the Center for Countering Digital Hate shows that on the largest incel website, rape is mentioned in posts every 29 minutes, with 89% of those posts referring to it in a positive sense. Moreover, 50% of users’ posts about child abuse on the same site are supportive of paedophilia. Indeed, the largest incel forum has recently changed its terms and conditions to allow mention of the sexualisation of pubescent minors—though not of pre-pubescent minors; it makes that distinction. This is disgusting and wrong, so why is it not covered in the Bill? I think there is a real opportunity to look at incel content, and I would be grateful if the Minister met the cross-party group again to discuss how we can ensure that it is harder and harder to find online and is ultimately removed, so that we can protect all our young people from going down this path.
In order to ensure that we get everybody in, I am going to introduce a five-minute time limit. I call Richard Burgon.
I have listened with interest to all the powerful speeches that have been made today. As legislation moves through Parliament, it is meant to be improved, but the great pity with this Bill is that it has got worse, not better. It is a real tragedy that measures protecting adults from harmful but legal content have been watered down.
I rise to speak against the amendments that have come from the Government, including amendments 11 to 14, 18 and 19, which relate to the removal of the adult safety duties. I am also speaking in favour of new clause 4, tabled by the Labour Front-Bench team, and amendment 43, tabled by the SNP, which go at least some of the way towards protecting adults from harmful but legal content.
Online Safety Bill Debate
Baroness Winterton of Doncaster (Labour - Life peer), Department for Science, Innovation & Technology
(1 year, 3 months ago)
Commons Chamber
I have three more speakers. I ask that colleagues bear that in mind so that I can bring in the Minister.
I would like to mention a very long journey in relation to the protection of children, because to my mind that is right at the heart of the Bill’s social value. I think it was Disraeli who said:
“The youth of a nation are the trustees of posterity.”
If we get it right in the early stages of their lives and we provide legislation that enables them to be properly protected, we are likely to get things right for the future. The Bill does that in a very good way.
The Bill also reflects some of the things in which I found myself involved in 1977—just over 45 years ago—with the Protection of Children Bill, when Cyril Townsend came top of the private Member’s Bill ballot. I mention that because at that time we received resistance from Government Ministers and others—I am afraid I must say that it was a Labour Minister—but we got the Bill through, because the then Prime Minister, James Callaghan, eventually ensured that it passed. His wife insisted on it, as a matter of fact.
I pay tribute to the House of Lords. Others have repeatedly mentioned the work of Baroness Kidron, but I would also like to mention Lord Bethell, Baroness Morgan and others, because it has been a combined effort. It has been Parliament at its best. I have heard others, including my hon. Friend the Member for Folkestone and Hythe (Damian Collins) and my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), make that point. It has been a remarkably lengthy but none the less essential process, and I pay tribute to those people for what they have done.
In retrospect, I would like to mention Baroness Lucy Faithfull, because back in 1977-78 I would not have known what to do if she had not worked relentlessly in the House of Lords to secure the measures necessary to protect children from sexual images and pornographic photography—it was about assault, and I do not need to go into the detail. The bottom line is that it was the first piece of legislation that swung the pendulum towards common sense and proportionality in matters that, 45 years later, have culminated in what has been discussed in the Bill and the amendments today.
I pay tribute to Ian Russell and to the others here whose children have been caught up in this terrible business. I pay specific tribute to the Secretary of State and the Minister, and also to the Health Secretary for his statement yesterday about a national suicide strategy, in which he referenced amendments to the Bill. Because I have had a lot to do with him, I would also like to pay tribute to Richard Collard of the National Society for the Prevention of Cruelty to Children, who has not been mentioned yet, for working so hard and effectively.
I pay tribute to my hon. Friend the Member for Penistone and Stocksbridge (Miriam Cates) for her work to help get the amendments through. The written ministerial statement came after some interesting discussions with the Minister, who was a bit surprised by our vehemence and determination. It was not chariots of fire but chariots on fire: within three weeks, by the time the Bill got to the House of Lords, we had a statement that set the tone for the part of the Bill I discussed just now, protecting children because they need protection at the right time in their lives.
The NSPCC tells us that 86% of UK adults want companies to understand how groomers and child abusers use their sites to harm children, and want action to prevent it by law. I came up with the idea, although the right hon. Member for Barking (Dame Margaret Hodge) gave us a lot of support in a debate in this House at the time, and I am grateful to her for that. The fact that we are able to come forward with this legislation owes a great deal to a lot of people from different parts of the House.
I very much accept that continuing review is necessary. Many ideas have been put forward in this debate, and I am sure that the Minister is taking them all on board and will ensure that the review happens and that Ofcom acts accordingly, which I am sure it will want to. It is important that that is done.
I must mention that the fact we have left the European Union has enabled us to produce legislation to protect children that is very significantly stronger than European Union legislation. The Digital Services Act falls very far short of what we are doing here. I pay tribute to the Government for promoting ideas based on our self-government to protect our voters’ children and our society. That step could only have been taken now that we have left the European Union.
Research by the NSPCC demonstrates that four in five victims of online grooming offences are girls. It is worth mentioning that, because it is a significant piece of research. It means there has to be clear guidance about the types of design to be incorporated, which will flow from the discussions to be had about how to make all this legislation work properly.
The only other thing I would like to say is that the £10-million suicide prevention grant fund announced yesterday complements the Bill very well. It is important that we have a degree of symmetry between legislation to prevent suicide and legislation to ensure that children are kept safe.