Internet Service Providers and Suicide-related Content Debate

Department: Department for Science, Innovation & Technology

Wednesday 18th December 2024


Commons Chamber
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

I thank the hon. Member for Leeds East (Richard Burgon) for opening the debate and all other colleagues who have contributed. I know that this issue will be close to the hearts of many of us, because it is about protecting the safety of everyone, including our children and young people.

This evening I want to talk about why this issue matters and what the Online Safety Act will do about it. First, I would like to share my deepest sympathies with the family and friends of Joe Nihill—a 23-year-old man who ended his life after finding suicide-related content online. Unfortunately, stories such as Joe’s are not uncommon—we have also heard about Tom, a 22-year-old man who died by suicide. As part of our work on online safety, we speak to groups that have campaigned for years for a safer internet, often led by bereaved families. I thank Joe’s mother Catherine, his sister-in-law Melanie and all the bereaved families for their tireless work. We continue to listen to their expertise in this conversation.

People who are thinking about ending their lives or hurting themselves might turn to the internet as a place of refuge. All too often, what they find instead is content encouraging them not to seek help. That deluge of content has a real-world impact. Suicide-related internet use is a factor in around a quarter of deaths by suicide among people aged 10 to 19 in the UK—at least 43 deaths a year. Lots of research in this area focuses on children, but it is important to recognise that suicide-related internet use can be a factor in suicide in all age groups. These harms are real, and tackling them must be a collective effort.

On the hon. Member’s first point, we welcome efforts by all companies, including internet service providers, to tackle illegal content so that no more lives are tragically lost to suicide. Online safety forms a key pillar of the Government’s suicide prevention strategy. However, we are clear that the principal responsibility sits squarely with those who post such hateful content, and with the sites where it is allowed to fester—sites that, until now, have not been made to face the consequences. The Online Safety Act has been a long time coming. A decade of delay has come at a tragic human cost, but change is on its way. On Monday, Ofcom published its draft illegal harms codes under the Online Safety Act, which are a step change.

On the hon. Member’s second point, I can confirm that from next spring, for the first time, social media platforms and search engines will have to look proactively for and take down illegal content. These codes will apply to sites big and small. If services do not comply, they could be hit with massive fines, or Ofcom could, with the agreement of the courts, use business disruption measures—court orders that require third parties to withdraw their services from, or restrict or block access to, non-compliant services in the UK. We have made intentionally encouraging or assisting suicide a priority offence under the Act. That means that all providers, no matter their size, will have to show that they are taking steps to stop their sites being used for such content.

The strongest protections in the Act’s framework are for children, so on the hon. Member’s third point, I assure him that under the draft child safety codes, any site that allows content promoting self-harm, eating disorders or suicide will now have to use highly effective age limits to stop children from accessing such content. Some sites will face extra duties. We have laid the draft regulations setting out the threshold conditions for category 1, 2A and 2B services under the Act. Category 1 sites are those that have the ability to spread content easily, quickly and widely. They will have to take down content that goes against their terms of service, such as posts that could encourage self-harm or eating disorders. They will also have to give adult users tools that make it less likely they will see content they do not want to see, or that alert them to the nature of potentially harmful content.

A suicide forum is unlikely to have terms of service that restrict legal suicide content, and users of these sites are unlikely to want to use tools that make it less likely they will see such content. However, that absolutely does not mean that such forums—what people call “small but risky” sites—can go unnoticed.

--- Later in debate ---
Motion made, and Question proposed, That this House do now adjourn.—(Taiwo Owatemi.)
Feryal Clark

Every site, whether it has five users or 500 million users, will have to proactively remove illegal content, such as content where there is proven intent to encourage someone to end their life. Ofcom has also set up a “small but risky” supervision taskforce to ensure that smaller forums comply with the new measures, and it is ready to take enforcement action if they do not. The Government understand that just one person seeing this kind of content could mean one body harmed, one life ended, and one family left grieving.

Munira Wilson

The problem is that the sites that the hon. Member for Leeds East (Richard Burgon) referred to—and there are many others like them—do not necessarily fall into the illegal category, although they still carry extremely dangerous and harmful content. Despite a cross-party vote in Parliament to include these very small and very dangerous sites in category 1 of the Online Safety Act, there has been a proactive decision to leave them out of the illegal harms codes, which were published yesterday. Can the Minister put on record exactly why that is? Why can these sites not be included in that category? All sorts of content glamourising suicide, self-harm and eating disorders, as well as other hate speech, is being promoted by these small sites. They should be regulated to a high level.

Feryal Clark

Category 1 is based on research regarding the likely impact of user numbers and functionalities; it is about the easy, quick and wide dissemination of regulated user-generated content. As Melanie Dawes set out in her letter to the Secretary of State in September, Ofcom has established a “small but risky” supervision taskforce, as I mentioned, to manage and enforce compliance among smaller services. It has the power to impose significant penalties and, as I say, to take remedial action against non-compliant services. As the hon. Member for Leeds East mentioned earlier, the Online Safety Act is one of the biggest steps that the Government have taken on online safety, but it is imperfect. It is an iterative process, and it will be kept under review.

I thank the hon. Gentleman for raising this matter, and for bringing to our memory Joe Nihill and those like him, who turned to the internet for help and were met with harm. On his final point, on the effective implementation of the Online Safety Act, we will continue to engage with all providers in this space. I am confident that these measures are a big step towards making tech companies play their part in wiping out those harms and making the internet a safer place for us all. The hon. Gentleman raised the matter of an outstanding question. I do not know whether it has gone to the wrong Department, but I will commit to looking up that question and ensuring that he receives a response to it.

With that, I thank you, Madam Deputy Speaker, and wish you and the whole House a very happy Christmas.

Question put and agreed to.