Online Safety Bill Debate
Lords Chamber

I am particularly grateful to the noble Lords who co-signed Amendments 96, 240 and 296 in this group. Amendment 225 is also important and warrants careful consideration, as it explicitly includes eating disorders. These amendments have strong support from Samaritans, which has helped me in drafting them, and from the Mental Health Foundation and the BMA. I declare that I am an elected member of the BMA ethics committee.
We have heard much in Committee about the need to protect children online even more effectively than the Bill currently does. On Tuesday the noble Baroness, Lady Morgan of Cotes, made a powerful speech acknowledging that vulnerability does not stop at the age of 18 and that the Bill currently creates a cliff edge whereby there is protection from harmful content for those under 18 but not for those over 18. The empowerment tools will be futile for those seriously contemplating suicide and self-harm. No one should underestimate the power of suicide contagion and the addictive nature of the content that is currently pushed out to people, goading them into such actions and drawing them into repeated viewings.
Amendment 96 seeks to redress that. It incorporates a stand-alone provision, creating a duty for providers of user-to-user services to manage harmful content about suicide or self-harm. This provision would operate as a specific category, relevant to all regulated services and applicable to both children and adults. Amendment 296 defines harmful suicide or self-harm content. It is important that we define that to avoid organisations such as Samaritans, which provide suicide prevention support, being inadvertently caught up in clumsy, simplistic search engine categorisation.
Suicide and self-harm content affects people of all ages. Adults in distress search the internet, and children easily bypass age-verification measures and parental controls even when they have been switched on. The Samaritans Lived Experience Panel reported that 82% of people who died by suicide, having visited websites that encouraged suicide and/or methods of self-harm, were over the age of 25.
Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include, but are not limited to, information, depictions, instructions and advice on methods of self-harm and suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide. As the Bill stands, platforms will not even need to consider the risk that such content could pose to adults. This will leave all that dangerous online content widely available and undermines the Bill’s intention from the outset.
Last month, other parliamentarians and I met Melanie, whose relative Jo died by suicide in 2020. He was just 23. He had accessed suicide-promoting content online, and his family are speaking out to ensure that the Bill works to avoid future tragedies. A University of Bristol study reported that those with severe suicidal thoughts actively use the internet to research effective methods and often find clear suggestions. Swansea University reported that three quarters of its research participants had harmed themselves more severely after viewing self-harm content online.
Amendment 240 complements the other amendments in this group, although it would not rely on them to be effective. It would establish a specific unit in Ofcom to monitor the prevalence of suicide, self-harm and harmful content online. I should declare that this is in line with the Private Member’s Bill I have introduced. In practice, that means that Ofcom would need to assess the efficacy of the legislation. It would require Ofcom to investigate the content and the algorithms that push such content out to individuals at an alarming rate.
Researchers at the Center for Countering Digital Hate set up new accounts in the USA, UK, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly on videos about body image and mental health, and “liked” them. Within 2.6 minutes, TikTok recommended suicide content, and it sent content on eating disorders within eight minutes.
Ofcom’s responsibility for ongoing review and data collection, reported to Parliament, would take a future-facing approach covering new technologies. New communications and internet technologies are being developed at pace in ways we cannot imagine. The term
“in a way equivalent … to”
in Amendment 240 is specifically designed to include the metaverse, where interactions are instantaneous, virtual and able to incite, encourage or provoke serious harm to others.
We increasingly live our lives online. Social media is expanding, while user-to-user sites are now shopping platforms for over 70% of UK consumers. However, the online world is also being used to sell suicide kits or lethal substances, as recently covered in the press. It is important that someone holds the responsibility for reporting on dangers in the online world. A systematic review found that harmful suicide content, including methods and encouragement, is concentrated on sites with low levels of moderation and easy search functions for images. Some 78% of people with lived experience of suicidality and self-harm surveyed by Samaritans agree that new laws are needed to make online spaces safer.
I urge noble Lords to support my amendments, which aim to ensure that self-harm, suicide and seriously harmful content is addressed across all platforms in all categories as well as search engines, regardless of their functionality or reach, and for all persons, regardless of age. Polling by Samaritans has shown high support for this: four out of five agree that harmful suicide and self-harm content can damage adults as well as children, while three-quarters agree that tech companies should by law prevent such content being shown to users of all ages.
If the Government are not minded to adopt these amendments, can the Minister tell us specifically how the Bill will take a comprehensive approach to placing duties on all platforms to reduce dangerous content promoting suicide and self-harm? Can the Government confirm that smaller sites, such as forums that encourage suicide, will need to remove priority illegal content, whatever the level of detail in their risk assessment? Lastly—I will give the Minister a moment to note my questions—do the Government recognise that we need an amendment on Report to create a new offence of assisting or encouraging suicide and serious self-harm? I beg to move.
My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.
I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week: the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, high-functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook
“a similar act, resulting in her death”.
I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.
We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is this: where we know that content is harmful to individuals and to broader society, why do the Government not want to take the step of setting out how that content should be properly regulated? I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.
This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.
My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.
My Lords, it is a great pleasure to move Amendment 97 and speak to Amendment 304, both standing in my name and supported by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth. I am very grateful for their support. I look forward to hearing the arguments by the noble Lord, Lord Stevenson, for Amendment 104 as well, which run in a similar vein.
These amendments are also supported by the Domestic Abuse Commissioner, the Revenge Porn Helpline, BT, EE and more than 100,000 UK citizens who have signed End Violence Against Women’s petition urging the Government to better protect women and girls in the Bill.
I am also very grateful to the noble Baroness, Lady Foster of Aghadrumsee—I know I pronounced that incorrectly—the very distinguished former Northern Ireland politician. She cannot be here to speak today in favour of the amendment but asked me to put on record her support for it.
I also offer my gratitude to the End Violence Against Women Coalition, Glitch, Refuge, Carnegie UK, NSPCC, 5Rights, Professor Clare McGlynn and Professor Lorna Woods. Between them all, they created the draft violence against women and girls code of practice many months ago, proving that a VAWG code of practice is not only necessary but absolutely deliverable.
Much has already been said on this, both here and outside the Chamber. In the time available, I will focus my case for these amendments on two very specific points. The first is why VAWG, violence against women and girls, should have a specific code of practice legislated for it, rather than other content we might debate. The second is what having a code of practice means in relation to the management of that content.
Ofcom has already published masses of research showing that abuse online is gendered. The Government’s own fact sheet, sent to us before these debates, said that women and girls experience disproportionate levels of abuse online. They experience a vast array of abuse online because of their gender, including cyberflashing, harassment, rape threats and stalking. As we have already heard and will continue to hear in these debates, some of those offences and abuse reach a criminal threshold and some do not. That is at the heart of this debate.
My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.
On Amendments 97 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.
I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.
Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.
My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services
“mitigate and manage the risk of the service being used for the commission or facilitation of”
an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.
To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the Domestic Abuse Commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from these in a separate code.
In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.
An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.
As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.
Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?
It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity here in some of the criticism that we are not reflecting the fact that there are compound harms to people affected in more than one way and then saying that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—
There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.
Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.
With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.
We have made changes to the Bill: the consultation with the Victims’ Commissioner and the Domestic Abuse Commissioner, and the introduction of specific offences to deal with cyberflashing and other particular harms which we know disproportionately affect women and girls. We are taking an approach throughout the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.
Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.
Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the
“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.
The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.
In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.
As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.
My Lords, I thank my noble friend for his response, which I will come on to in a moment. This has been a fascinating debate. Yet again, it has gone to the heart of some of the issues with this Bill. I thank all noble Lords who have spoken, even though I did not quite agree with everything they said. It is good that this Committee shows just how seriously it takes the issue of violence against women and girls. I particularly thank all those who are watching from outside. This issue is important to so many.
There is no time to run through all the brilliant contributions that have been made. I thank the right reverend Prelate the Bishop of Gloucester for her support. She made the point that, these days, for most people, there is no online/offline distinction. To answer one of the points made, we sometimes see violence or abuse that starts online and then translates into the offline world. Teachers in particular are saying that this is the sort of misogyny they are seeing in classrooms.
As the noble Baroness, Lady Merron, said, the onus should not be on women and girls to remove themselves from online spaces. I also thank the noble Baronesses, Lady Kidron and Lady Gohir, for their support. The noble Baroness, Lady Kidron, talked about the toxic levels of online violence. Parliament needs to say that this is not okay—which means that we will carry on with this debate.
I thank the noble Baroness, Lady Healy, for her contribution. She illustrated so well why a code of practice is needed. We can obviously discuss this, but I do not think the Minister is quite right about the user reporting element. For example, we have heard various women speaking out who have had multiple rape threats. At the moment, the platforms require each one to be reported individually. They do not put them together and then work out the scale of threat against a particular user. I am afraid that this sort of threat would not breach the illegal content threshold and therefore would not be caught by the Bill, despite what the Minister has been saying.
I agree with my noble friend Lady Stowell. I would love to see basic standards—I think she called it “civility” —and a better society between men and women. One of the things that attracts me most to the code of practice is that it seeks cultural and societal changes—not just whack-a-mole with individual offences but changing the whole online culture to build a healthier and better society.
I will certainly take up the Minister’s offer of a meeting. His response was disappointing. There was no logic to it at all. He said that the voice of women and girls is heard throughout the Bill. How can this be the case when the very phrase “women and girls” is not mentioned in its 262 pages? Some 100,000 people outside this Chamber disagree with his position on the need for there to be a code of practice. I say to both Ofcom and the tech platforms that a code has been drafted. Please do not take the attitude of “not drafted here; we’re not going to adopt it”. It is there, the work has been done and it can easily be taken on.
I would be delighted to discuss the definition in Amendment 304 with my noble friend. I will of course withdraw my amendment tonight, but we will certainly return to this on Report.