Internet Service Providers and Suicide-related Content

Wednesday 18th December 2024

Commons Chamber
Motion made, and Question proposed, That this House do now adjourn.—(Taiwo Owatemi.)
18:34
Richard Burgon (Leeds East) (Ind)

The reason I have sought this Adjournment debate on internet service providers and suicide-related content online arises from a terrible tragedy that happened in my constituency. My constituent Joe Nihill was aged just 23 when he took his own life back in 2020 after accessing a horrific website. The purpose of that website is something that will alarm every Member of this House: it is dedicated to pushing people towards suicide. In fact, the website—which I will not name for reasons of public safety—was the subject of a BBC investigation linking it to more than 50 deaths in the UK, but it is linked to many more deaths around the world. That BBC investigation, which took place a year ago, rightly identified multiple warnings to the UK Government by coroners, and a number of police investigations.

To be clear, this website pushes people to suicide by encouraging suicide, and by actively attempting to dissuade them from seeking mental health support or the support of their family and friends. It provides people with instructions on how to take their own life, it has links to where substances can be purchased, and it has even livestreamed suicides.

Jim Shannon (Strangford) (DUP)

I commend the hon. Gentleman for bringing this debate before the House. I spoke to him before the debate; this is an issue that needs airing, and he is doing us all justice by doing so.

I am aware that some streaming services such as Disney+ will put disclaimers in place for graphic self-harm and suicide scenes. Netflix went a step further: it removed the final episode of its programme “13 Reasons Why”, as it contained a highly graphic scene of suicide that many found distressing. Does the hon. Member agree that streaming services that screen scenes of suicide must, as an industry standard, have a responsibility to consider the age range of their target audience? What we are asking for tonight is for the Minister and this Government to take action.

Richard Burgon

The hon. Member is correct that everyone should exercise great moral responsibility when putting stuff out there for people to see and be influenced by.

Joe Nihill’s mother Catherine and his sister-in-law Melanie have run an inspiring campaign in the wake of that tragedy to stop what happened to Joe back in April 2020 happening to other people. Before he took his own life, Joe left a note for his family, and in that note, he asked them to do everything they could to get this website taken down so that others were not pushed down the same path as him. Catherine and Melanie have saved lives as a result of their interventions, personally preventing people from going down that path. What is needed, though, is more than the heroism of people such as Catherine and Melanie—it has saved lives, but it is not enough. What is needed is a change in the law. Of course, I welcome the advance made in this regard through the Online Safety Act 2023, which I will turn to later.

Munira Wilson (Twickenham) (LD)

I congratulate the hon. Gentleman on securing this important debate. My constituent David Parfett has been in the news speaking about his son Tom, who sadly took his own life following his visits to a very harmful site—quite possibly the same one that the hon. Gentleman is talking about—that promotes ways in which people can take their own lives. He sourced poison that way. There are 97 Britons who have lost their lives after using this website. We need to take action on these very small but very harmful websites. The Online Safety Act contains a provision for such websites to be included in category 1, the most highly regulated category, yet the illegal harms code published yesterday does not include them. Does the hon. Gentleman agree that this is a massive oversight, and that these websites should be included in category 1?

Richard Burgon

I thank the hon. Member for her intervention, and I will mention her constituent’s horrific experience later in my speech. I agree that there is much further to go to ensure that the Online Safety Act does what it needs to do to protect as many people as possible.

Of this website, Joe’s sister-in-law Melanie said yesterday on social media that

“the problem with these websites is that they are accessed by people at their most vulnerable and children. I’m Joe’s sister in law and I know Joe would still be here if he hadn’t accessed that website, because the method he used is only discussed there; he wouldn’t have known any other way. These sites are run by people who prey on the vulnerable and say they too are going to end their life, but 4 years later they are still here doing the same thing, pushing methods. We are never going to end suicide, but we know that so many people can be helped.”

The BBC investigation identified one of the creators of the site, and tracked him down to his home in Huntsville, Alabama in the US. He was doorstepped by the BBC reporter and he refused to answer any questions, but an account associated with this creator of the site issued defiant responses about the UK’s wanting to block the site.

As part of its investigation a year ago, the BBC contacted internet service providers, as did Joe’s sister-in-law and his mother. Sky Broadband, for example, responded by saying that it had blocked the site. Catherine and Melanie said at the time:

“It’s really important to us both, as it means access is becoming limited to prevent others…finding it—which is a step in the right direction.”

The hon. Member mentioned her constituent David Parfett, and David’s son Tom was 22 when he ended his own life in 2021 after accessing this site. Responding to Sky Broadband’s decision as an internet service provider a year ago to block this site, Mr Parfett said:

“It made me cry. It’s pure relief, mixed with anger that Tom may still be here if”

it

“had been regulated two years ago. My sole aim has been to stop other people being influenced to take their own life.”

Responding to a defiant statement from the account linked to the founder of the website, Mr Parfett added:

“These people encourage others to die and celebrate death”.

In a statement at the time—just over a year ago—Ofcom told BBC News about the then Online Safety Bill:

“If services don’t comply, we’ll have a broad range of enforcement powers at our disposal to ensure they’re held accountable”.

In a recent Westminster Hall debate, I intervened on the Minister about this, and I congratulated the internet service providers Sky and Three on taking action to block access to this site. The Minister very helpfully welcomed that intervention, and made the important point that

“internet providers do not have to wait for the Act to be enacted; they can start making such changes now.”

She went on to say that

“the Online Safety Act…is a landmark Act, but it is also imperfect. Ofcom’s need to consult means a long lead-in time; although it is important to get these matters right, that can often feel frustrating.”—[Official Report, 26 November 2024; Vol. 757, c. 250WH.]

It is right that internet service providers do the right thing and take responsibility.

Just as Joe’s family have been contacting internet service providers, so have I. I very much welcome the fact that Three has responded to representations by blocking this site, which I will not name, as has Sky. Other responses were not quite as positive or as practical. Vodafone responded by saying that the site is blocked

“where customers have adult content filters enabled”.

BT responded by saying that

“our fixed network level broadband parental control settings for all ages block the site”.

The response from Virgin Media O2 concerned me, and I want to put it on the record. It originally came back to me saying that it would block the site if a court order told it to. We need to be clear that it is not impressive to say, “If a court tells us to do something, we will do it.” A court order is a court order, and companies have no choice other than to comply. Virgin Media O2 also referred to people changing settings so that they cannot access this site. Virgin Media O2 needs to get real. Somebody who is in the mindset of considering taking their own life—somebody who is struggling to control that impulse—is not likely to change a setting to stop themselves from looking at it.

Dr Scott Arthur (Edinburgh South West) (Lab)

My hon. Friend is making a powerful speech. I did not come here to speak, but he is discussing a key topic. Many of us are looking forward to Christmas, but it can be a low time for people. I worry about people accessing this content and content around eating disorders. The question for any internet service provider—hopefully they are watching this debate—is, what possible justification can they have for continuing to allow access to this site? Are they hiding behind freedom of speech? To me, there is a complete imbalance between the need to protect the rights of these young people and the wider freedom of speech argument.

Richard Burgon

I could not agree more with my hon. Friend. This is not a freedom of speech issue; this is a particular website linked to the deaths of 50 people in our country and many more worldwide.

In its reply, Virgin Media O2 also said that it was handling this matter through its partnership with the Internet Watch Foundation. I contacted the Internet Watch Foundation, and it replied that

“we work with companies to block child sexual abuse material specifically, so don’t work on suicide related content I am afraid”.

It was therefore a poor reassurance from Virgin Media O2 to point to a partnership with an organisation that does great work, I am sure, but not in relation to this specific issue.

I pressed Virgin Media O2 further, and it said:

“We will review the specific website you raised with us and consider if further action should be taken”.

Of course further action should be taken. There are technological limits that sometimes mean a block cannot be 100% effective, but lives can be saved and will be saved by restricting the number of people who access this site.

I put on record that I have had no answer from EE. It should answer, and it should act. I encourage all internet service providers to do the right thing and, in whatever way they can, to block this specific site, which is linked to 50 UK deaths, is the subject of police investigations, as we understand it, and is referred to in various coroners’ reports.

To give a sense of the scale of the challenge, Three UK has kindly provided me with data today that shows that it has blocked 10,025 attempts to access URLs that it has categorised under suicide and self-harm in the past month alone. Three UK should be congratulated on what it has done. The fact that it can inform me of the number of attempts to access such sites that it has blocked shows why it is fundamentally necessary for other companies to do the right thing.

The site is hosted by Cloudflare, a major company with a good reputation and a corporate office in London. I draw the House’s attention to a written question asked by the right hon. Member for Goole and Pocklington (David Davis), whom I emailed earlier about this. On 24 October 2023, he tabled a written question that was passed to the Home Office. It said:

“To ask the Secretary of State for the Home Department, whether her Department has held recent discussions with Cloudflare on removing the website linked to deaths by suicide reported on by the BBC on 24 October 2023.”

He was asking what the then Government had done to pressure Cloudflare, which hosts this site, to take it down and disrupt its operation. No answer was given. He is still awaiting a response to that question, which was due an answer on 31 October 2023.

On 29 May 2024, I wrote to the chief executive officer of Cloudflare, Matthew Prince, making it clear what had happened in this situation. I said:

“The reason I am writing to you today is because it appears your company is hosting this website and I would like to draw this to your attention so you can terminate your hosting of this site, to protect the public in both our countries”—

the USA and the UK—

“and across the world. I know a successful company of over a decade’s good standing like Cloudflare with an excellent reputation, would not wish to be associated with such harmful content, linked to the deaths of many vulnerable people across the world.”

I detailed the whole matter, as I have detailed it to the House, and then I put:

“I would be very grateful if you look into this matter as a matter of urgency before any more vulnerable people are encouraged or enabled to harm themselves due to this website’s activities. Cloudflare ceasing to host its website would not be a contravention of the principle of freedom of speech but a choice of a reputable and respected company not to give a platform to a website which has been linked to the death of 50 people in the UK alone. Such a decision by Cloudflare could well save lives.”

I said:

“It should be noted that both Sky Broadband and 3 mobile have blocked access to this website”.

I got no response to that letter on a really serious matter. I hope not only that internet service providers will do the right thing, but that the major company Cloudflare will do the right thing and stop hosting this website. Disrupting its operation in that way could save lives, and I believe that it would save lives.

To conclude, I will ask the Minister, who has been doing a fantastic job on these sensitive issues, a number of questions. Will she congratulate those internet service providers who have done the right thing in taking action to block this site? Does she agree that those who have not should step up to save lives? Will she assure me that once Ofcom’s powers are fully in force, the Online Safety Act 2023 will deal with this specific site regardless of the number of people who access it and whether those people are under or over 18?

I find it frustrating when internet service providers get back to me and refer to child protection. My constituent Joe was 23 when he took his own life, and the constituent of the hon. Member for Twickenham (Munira Wilson) was 21 or 22 when he took his own life, so it is ridiculous to assume that harmful suicide-related content of this type is only a danger to people under 18.

In relation to the question to the Home Office tabled on 24 October 2023 by the right hon. Member for Goole and Pocklington, will the Minister take action to ensure that her Department answers that question? Will she agree to the Government contacting Cloudflare, as the host of the site, to raise concerns and make representations? We are talking about the deaths of 50 citizens in our country.

I will finish my remarks by again paying tribute to Catherine and Melanie, Joe’s mother and sister-in-law. They have been navigating this complex, ever-changing world of dangerous online activity, and their actions have saved lives. They have been struggling to do so against great odds—it sometimes feels like a David and Goliath situation.

I note that when people are, for example, illegally streaming football matches, action is taken very quickly, yet this website, which is linked to the deaths of 50 people, remains up there. I look forward to the Minister’s response and thank Members for attending the debate.

18:54
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Feryal Clark)

I thank the hon. Member for Leeds East (Richard Burgon) for opening the debate and all other colleagues who have contributed. I know that this issue will be close to the hearts of many of us, because it is about protecting the safety of everyone, including our children and young people.

This evening I want to talk about why this issue matters and what the Online Safety Act will do about it. First, I would like to share my deepest sympathies with the family and friends of Joe Nihill—a 23-year-old man who ended his life after finding suicide-related content online. Unfortunately, stories such as Joe’s are not uncommon—we have heard about Tom, a 22-year-old man who also died by suicide. As part of our work on online safety, we speak to groups that have campaigned for years for a safer internet, often led by bereaved families. I thank Joe’s mother Catherine, his sister-in-law Melanie and all the bereaved families for their tireless work. We continue to listen to their expertise in this conversation.

People who are thinking about ending their lives or hurting themselves might turn to the internet as a place of refuge. All too often, what they find instead is content encouraging them not to seek help. That deluge of content has a real-world impact. Suicide-related internet use is a factor in around a quarter of deaths by suicide among people aged 10 to 19 in the UK—at least 43 deaths a year. Lots of research in this area focuses on children, but it is important to recognise that suicide-related internet use can be a factor in suicide in all age groups. These harms are real, and tackling them must be a collective effort.

On the hon. Member’s first point, we welcome efforts by all companies, including internet service providers, to tackle illegal content so that no more lives are tragically lost to suicide. Online safety forms a key pillar of the Government’s suicide prevention strategy. However, we are clear that the principal responsibility sits squarely with those who post such hateful content and the sites where it is allowed to fester—sites that, until now, have not been made to face the consequences. The Online Safety Act has been a long time coming. A decade of delay has come at a tragic human cost, but change is on its way. On Monday, Ofcom published its draft illegal harms codes under the Online Safety Act, which represent a step change.

On the hon. Member’s second point, I can confirm that from next spring, for the first time, social media platforms and search engines will have to look proactively for and take down illegal content. These codes will apply to sites big and small. If services do not comply, they could be hit by massive fines, or Ofcom could, with the agreement of the courts, use business disruption measures—court orders that mean that third parties have to withdraw their services or restrict or block access to non-compliant services in the UK. We have made intentionally encouraging or assisting suicide a priority offence under the Act. That means that all providers, no matter their size, will have to show that they are taking steps to stop their sites being used for such content.

The strongest protections in the Act’s framework are for children, so on the hon. Member’s third point, I assure him that under the draft child safety codes, any site that allows content that promotes self-harm, eating disorders or suicide will now have to use highly effective age checks to stop children from accessing such content. Some sites will face extra duties. We have laid the draft regulations setting out the threshold conditions for category 1, 2A and 2B services under the Act. Category 1 sites are those that have the ability to spread content easily, quickly and widely. They will have to take down content if it goes against their terms of service, such as posts that could encourage self-harm or eating disorders. They will also have to give adult users tools that make it less likely they will see content that they do not want to see, or that alert them to the nature of potentially harmful content.

A suicide forum is unlikely to have terms of service that restrict legal suicide content, and users of these sites are unlikely to want to use tools that make it less likely they will see such content. However, that absolutely does not mean that such forums—what people call “small but risky” sites—can go unnoticed.

18:59
Motion lapsed (Standing Order No. 9(3)).
Motion made, and Question proposed, That this House do now adjourn.—(Taiwo Owatemi.)
Feryal Clark

Every site, whether it has five users or 500 million users, will have to proactively remove illegal content, such as content where there is proven intent of encouraging someone to end their life. Ofcom has also set up a “small but risky” supervision taskforce to ensure that smaller forums comply with new measures, and it is ready to take enforcement action if they do not do so. The Government understand that just one person seeing this kind of content could mean one body harmed, one life ended, and one family left grieving.

Munira Wilson

The problem is that the sites that the hon. Member for Leeds East (Richard Burgon) referred to—and there are many others like them—do not necessarily fall into the illegal category, although they still carry extremely dangerous and harmful content. Despite a cross-party vote in Parliament to include these very small and very dangerous sites in category 1 under the Online Safety Act, there has been a proactive decision to leave them out of the illegal harms codes, which were published yesterday. Can the Minister put on record exactly why that is? Why can these sites not be included in that category? All sorts of content glamourising suicide, self-harm and eating disorders, along with other hate speech, is being promoted by these small sites. They should be regulated to a high level.

Feryal Clark

Based on research regarding the likely impact of user numbers and functionalities, category 1 is about the easy, quick and wide dissemination of regulated user-generated content. As Melanie Dawes set out in her letter to the Secretary of State in September, Ofcom has established a “small but risky” supervision taskforce, as I mentioned, to manage and enforce compliance among smaller services. It has the power to impose significant penalties and, as I say, to take remedial action against non-compliant services. As the hon. Member for Leeds East mentioned earlier, the Online Safety Act is one of the biggest steps that the Government have taken on online safety, but it is imperfect. It is an iterative process, and it will be kept under review.

I thank the hon. Gentleman for raising this matter, and for bringing to our memory Joe Nihill and those like him, who turned to the internet for help and were met with harm. On his final point, on the effective implementation of the Online Safety Act, we will continue to engage with all providers in this space. I am confident that these measures are a big step in making tech companies play their part in wiping out those harms and making the internet a safer place for us all. The hon. Gentleman raised the matter of an outstanding question. I do not know whether he has gone to the wrong Department, but I will commit to looking up that question and ensuring that he receives a response to it.

With that, I thank you, Madam Deputy Speaker, and wish you and the whole House a very happy Christmas.

Question put and agreed to.

18:59
House adjourned.