OFCOM (Duty regarding Prevention of Serious Self-harm and Suicide) Bill [HL] Debate
Lord Parkinson of Whitley Bay (Conservative - Life peer)
Department for Digital, Culture, Media & Sport
(1 year, 9 months ago)
Lords Chamber
My Lords, I am very grateful to the noble Baroness, Lady Finlay of Llandaff, for bringing forward her Bill, and to all noble Lords who have taken part in our debate, most particularly the noble Baroness, Lady Smith of Newnham, whose powerful, brave and personal words moved us all but also underlined the importance for so many families of the topic we are discussing today. The Government fully understand just how devastating these harms are, both to children and to adults, and the effect that those harms have on their families and friends, as well as the role that social media platforms and search engines can play in exacerbating them.
As the noble Baroness, Lady Finlay, outlined, her Bill was due to be read a second time the day after the death of Her late Majesty the Queen. That very sad reason for delay has meant that we are able to look at it alongside the Online Safety Bill, now before your Lordships’ House, which is helpful. I will endeavour to explain why the Government think that Bill deals with many of the issues raised, while keeping an open mind, as I said at its Second Reading on Wednesday, on suggestions for how it could do so more effectively.
I will first address the scope and intentions of the Online Safety Bill, particularly how it protects adults and children from horrific content such as this. As I outlined in our debate on Wednesday, the Online Safety Bill offers adult users a triple shield of protection, striking a balance between forcing platforms to be transparent about their actions and empowering adult users with tools to manage their experience online.
The first part of the shield requires all companies in scope of the Bill to tackle criminal activity online when it is flagged to them. They will have duties proactively to tackle priority illegal content and will need to prevent their services being used to facilitate the priority offences listed in the Bill, which include encouraging or assisting suicide.
The second part of the shield requires the largest user-to-user platforms, category 1 services under the Bill, to ensure that any terms of service they set are properly enforced. For instance, if a major social media platform says in its terms of service that it does not allow harmful suicide content, it must adhere to that. I will address this in greater detail in a moment, but Ofcom will have the power to hold platforms to their terms and conditions, which will help to create a safer, more transparent environment for all.
The third part of the shield requires category 1 services to provide adults with tools either to reduce the likelihood of encountering certain categories of content, if they so choose, or to alert them to the nature of that content. That includes content that encourages, promotes or provides instruction for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified accounts, if they wish. That will give them the power to address the concern raised by my noble friend Lord Balfe about anonymous accounts. If anonymous accounts are pushing illegal content, the police already have powers through the Investigatory Powers Act to access communications data to bring the people behind that to book.
Through our triple shield, adult users will be empowered to make more informed choices about the services they use and have greater control over whom and what they engage with online.
As noble Lords know, child safety is a crucial component of the Online Safety Bill, and protecting children from harm remains our priority. As well as protecting children from illegal material, such as intentional encouragement of or assistance in suicide, all in-scope services likely to be accessed by children will be required to assess the risks to children on their service, and to provide safety measures to protect them from age-inappropriate and harmful content. This includes content promoting suicide, eating disorders and self-harm that does not meet a criminal threshold, as well as harmful behaviour such as cyberbullying.
Providers will also be required to consider, as part of their risk assessments, how functions such as algorithms could affect children’s exposure to illegal and other harmful content on their service. They must take steps to mitigate and manage any risks. Finally, providers may need to use age-assurance measures to identify the age of their users, to meet the child safety duties and to enforce age restrictions on their service.
A number of noble Lords talked about algorithms, so I will say a little more about that, repeating what I outlined on Wednesday. Under the Online Safety Bill, companies will need to take steps to mitigate the harm associated with their algorithms. That includes ensuring that algorithms do not promote illegal content, ensuring that predictive searches do not drive children towards harmful content and signposting children who search for harmful content towards resources and support.
Ofcom will also be given a range of powers to help it assess whether companies are fulfilling their duties in relation to algorithms. It will have powers to require information from companies about the operation of their algorithms, to interview employees, to require regulated service providers to undergo a skilled person's report, and to require audits of companies' systems and processes. It will also have the power to inspect premises and access data and equipment, so the Bill is indeed looking at the harmful effects of algorithms.
Moreover, I am pleased that many of the ambitions that lie behind the noble Baroness’s Bill will be achieved through a new communications offence that will capture the intentional encouragement and assistance of self-harm, as noble Lords have highlighted today. That new offence will apply to all victims, adults as well as children, and is an important step forward in tackling such abhorrent content. The Government are considering how that offence should be drafted. We are working with colleagues at the Ministry of Justice and taking into account views expressed by the Law Commission. As I said on Wednesday, our door remains open and I am keen to discuss this with noble Lords from all parties and none to ensure we get this right. We look forward to further conversations with noble Lords between now and Committee.
Finally, I want briefly to mention how in our view the aims of the noble Baroness’s Bill risk duplicating some of the work the Government are taking forward in these areas. The Bill proposes requiring Ofcom to establish a unit to advise the Secretary of State on the use of user-to-user platforms and search engines to encourage and assist serious self-harm and activities associated with the risk of suicide. The unit’s advice would focus on the extent of harmful content, the effectiveness of current regulation and potential changes in regulation to help prevent these harms. The noble Baroness is right to raise the issue, and I think her Bill is intended to complement the Online Safety Bill regime to ensure that it remains responsive to the way in which specific harms develop over time.
On Wednesday we heard from my noble friend Lord Sarfraz about some of the emerging threats, but I hope I have reassured the noble Baroness and other noble Lords that suicide and self-harm content will be robustly covered by the regime that the Online Safety Bill sets up. It is up to Ofcom to determine how best to employ its resources to combat these harms effectively and swiftly. For instance, under the Online Safety Bill, Ofcom is required to build and maintain an in-depth understanding of the risks posed by in-scope services, meaning that the regime the Bill brings forward will remain responsive to the ways in which harms manifest themselves both online and offline, such as in cases of cyberstalking or cyberbullying.
The Government believe that Ofcom as the regulator is best placed to hold providers accountable and to respond to any failings in adhering to their codes of practice. It has the expertise to regulate and enforce the Online Safety Bill’s provisions and to implement the findings of its own research. Its work as the regulator will also consider evidence from experts across the sector, such as Samaritans, which has rightly been named a number of times today and kindly wrote to me ahead of this debate and our debate on the Online Safety Bill. We therefore think that this work covers the same ground as the advisory function of the unit proposed in the noble Baroness’s Bill, and I hope this has reassured her that the area that she highlights through it is indeed being looked at in the Government’s Bill.
That is why the Government believe that the Online Safety Bill now before your Lordships’ House represents the strong action that we need to prevent the encouragement or assistance of self-harm, suicide and related acts online, and why we think it achieves the same objectives as the noble Baroness’s Bill. It is further strengthened, as I say, by the new stand-alone offence that we are bringing forward which addresses communications that intentionally encourage or assist self-harm, about which I am happy to speak to noble Lords.
I am glad we have had the opportunity today, between Second Reading and Committee of that Bill, to look at this issue in detail, and I know we will continue to do so, both inside and outside the Chamber. For the reasons I have given, though, we cannot support the noble Baroness’s Private Member’s Bill today.