Lords Chamber
That the Bill be now read a second time.
My Lords, I am most grateful to the Samaritans for all its help with this Bill, and to Papyrus, YoungMinds, the Mental Health Foundation, the British Psychological Society, If U Care Share and others for their support. I am also grateful to the Library for updating its full briefing.
The original Second Reading of this Bill was cancelled due to the sad death of Her Majesty the Queen. It now falls between Second Reading and Committee of the Government’s Online Safety Bill. In the spirit of co-operation called for by the noble Lord, Lord Stevenson of Balmacara, on Wednesday evening, I hope today’s debate will help identify how the principle of my Bill could improve the Online Safety Bill. My Bill would create a duty on Ofcom that complements the Online Safety Bill. In practice, this means that Ofcom would need to assess how prevalent self-harm and suicide content is online, and whether the legislative regime is well-equipped to protect individuals from being exposed to and fed excessively harmful content.
Why did I table this Bill? In 2021, 5,583 people in England and Wales took their own lives. Suicide is complex, rarely caused by one thing and cuts across all age groups. A University of Bristol study found that participants with severe suicidal thoughts actively used the internet to research an effective method, and often found clear suggestions. We must recognise that the smaller platforms—not just category 1 or 2A platforms—have some of the most explicit and harmful details.
Self-harm signals serious emotional distress and is a strong risk factor for future suicide, although fortunately most people who self-harm will not go on to take their own life. For 20 years, self-harm rates have increased, particularly among young people, and have more than doubled in England since the turn of the millennium. Among those surveyed by Samaritans, three-quarters had harmed themselves more severely after viewing self-harm content online. Some 78% of people with lived experience of suicidality and self-harm want new laws to make online spaces safer. The internet can be an invaluable space for individuals to access support and to express difficult feelings, but its algorithms can also barrage people with content that encourages or exacerbates self-harm and suicidal behaviours.
The Law Commission’s 2021 report on modernising communications recognised the need to tackle “legal but harmful”. The Online Safety Bill as now written contains two cliff edges: one is the chronological age of 18; the other is the point that content is defined as illegal. The latter is not as easy as it might seem. Section 59 of the Coroners and Justice Act 2009 states that a person commits an offence if they intentionally undertake an act
“capable of encouraging or assisting the suicide or attempted suicide of another person”,
yet no prosecution for online encouragement has been brought. Is that because of the burden of proof required?
In the gap between these two cliff edges of age and illegality sits the thorny issue of “legal but harmful”. My Bill would require Ofcom to establish a unit to advise government on the extent to which social media platforms encourage self-harm or suicide, advise on the effectiveness of current regulations and make recommendations. This would support suicide prevention strategies across public health and education.
Last summer, we heard about ligature challenges so harmful that youngsters died or were brain damaged. Now, the virtual reality environment, the metaverse, simulates a real-world arena for practising offending behind closed doors—a pathway to real-life abuse.
Clause 2 recognises that people react in different ways to what they find online, so what is harmful to one person is not harmful to another. What matters is whether the information is posted or sent with malicious intent, without reasonable excuse. What can be the justification for flooding people with ever more violent, disturbing images, other than profit? No one can pretend that that is providing support.
The Government’s decision to remove regulation of legal but extremely harmful content is a backward step, given that susceptibility to harm does not end when people reach the age of 18. This will leave huge amounts of dangerous content widely available, including instruction on methods and pushed content portraying and romanticising self-harm and suicide as positive and desirable. New research commissioned by the Samaritans found that the Government’s removal of protection of over-18s from damaging content goes directly against what the public want. Four in five—83%—agree that harmful suicide and self-harm content can have a damaging effect on adults, not just children. Less than one in six think that access should be restricted only for children. Removing the regulation of legal but extremely harmful content means that platforms will not need to consider risk to adult users or victims. Although platforms will need to provide empowerment tools for such content, these will not protect the vulnerable users who are already drawn to or sucked into damaging content.
The creation of the new offence of encouragement or assistance of serious self-harm should be introduced in time to be listed as priority legal content within the Online Safety Bill. It needs to be drafted narrowly, so that at-risk individuals and charities providing self-harm services are not criminalised. As the noble Lord, Lord Sarfraz, said at Second Reading of the Online Safety Bill,
“we cannot always play catch-up with technology.”—[Official Report, 1/2/23; col. 762.]
Technologies are emerging faster than we can imagine and can assist in plugging the gap of so-called legal but harmful. Harnessing them will be the only way to make the internet safer, rather than leaving it a playing field for those of malicious intent who profit from exploiting people’s vulnerabilities.
We need completely different approaches from those of film or television classification because material is constantly being posted on the internet, and no human being can keep up with that. Generic approaches must set standards against which monitoring can occur so that risk of harm is minimised. That will involve engaging with highly sophisticated techniques in artificial intelligence, not crude algorithms, while accepting that artificial intelligence will make mistakes just as humans do, and that the accuracy depends on the way that screening mechanisms are trained.
In preparing for the Bill I asked the question: “How could AI filter out harmful content on the internet?” I got the reply that AI can filter out harmful content by using various techniques, such as natural language processing, image recognition, video analysis and machine learning. With this came the statement that
“it is important to note that AI is not perfect and can still make mistakes. It is crucial to have human oversight and review of AI-generated results to ensure the accuracy and fairness of content filtering.”
I then asked: “How accurate is AI? Could it accidentally remove content that is not harmful?”, to which I received the response that the accidental removal of content that is not harmful can happen for several reasons, including bias in training data, ambiguous content and false positives. As well as needing human oversight, I was told that:
“It is also important to continually evaluate and improve AI models to reduce the risk of mistakes.”
It was an AI chatbot that gave me those answers, in seconds.
I also asked the site to write a short speech about my Bill. The result would have been rather good for a school debate—I fear that some of your Lordships might even have thought it better than my speech today. Yesterday’s science fiction is here today. I beg to move.
My Lords, I intervene to give my strong support to this Bill, which is a good step forward. I hope, in view of the debate we had the other night about the Online Safety Bill, that we will be able to meld these difficulties together into one Bill, but if we cannot, this is a good step forward.
I have come into this area only tangentially in that, when I was a member of the Council of Europe, I was involved in an assessment of AI and the uses to which it could be put. This ties in very much with the latter part of the noble Baroness’s speech, because what comes out of the box has to be put into the box. We were studying sentencing and whether you could use AI to sentence prisoners. Believe it or not, that has been tried in the State of Florida. Analysis afterwards showed quite clearly that the people who were inputting the information had a bias which they were not necessarily aware of but which was making the sentencing unbalanced. In fact, it was making it more likely that black people in Florida would be sentenced to longer prison terms, and more likely that they would be found guilty and sentenced than white people. However, when the researchers we employed went back to dig out the data, they did not find any bias in the people who were inputting it. In other words, there was no deliberate attempt to bias the data; it had all come about because of the unconscious bias which we probably all have buried within us. Therefore, we need to be very careful in this area.
It is a good step forward for Ofcom to set up a group to look at suicide and what it can do to address it—I am pleased to see that. However, I disagree marginally with the noble Baroness. It is not just about profit. One of the problems with the internet is the mental health issues of the people posting the information. We saw this the other day when I referred to the images that were shown to the unfortunate little girl who took her own life. No one was actually making any money out of it, but they were undoubtedly getting psychological thrills from causing deep pain and harm. This is one of the things we have to address: it is not just about money, and in many cases, this is what happens on the web.
I am no expert on the web—in fact, at home I am a bit of a joke because of my lack of knowledge of how to navigate it—but what I have seen shows me that serious steps need to be taken. As I said in my speech the other night, we have to tackle vigorously the concept of anonymity on the web. There should be a way of tracing what is being posted and who is posting it, so that regulators and, if necessary, the police, can quickly get to the source. I made the point the other night—I will make it again—that the more anonymous a posting can be, the more unacceptable the sentiments in it often are. We are going to have to tackle this question of anonymity.
In my lifetime, I have known three people who ended their lives by committing suicide. None of them were children. For two of the three, it was completely unexpected. The only thing that could be said afterwards—of course, all the inquiry and debate comes afterwards—is that they had felt very isolated in facing up to the problems of their lives. It brought to mind the case, for those noble Lords who remember, of Dr David Kelly, who killed himself. His case was undoubtedly affected by his familial relations and the fact that he did not feel he had the level of support he needed.
The third person I knew who committed suicide suffered from deep mental depression. She was an Oxford graduate so she was not someone at the margin of life. She had a good degree and held a good job but she went into a spiral of depression to a point where, as one of my friends said, “She just won’t be helped, will she?” It was very sad but no one knew what to do. Other than locking her up in a secure room and keeping a watch on her, we probably could not have prevented her suicide. It was something that, I am afraid, a number of us expected to happen but were helpless in trying to prevent—although a number of us did try to prevent it by getting social services and mental health services involved. One of the lessons we must learn is that, for some people, the mental state into which they get is very difficult to help with. It is no good blaming the National Health Service for it. The health service is terribly overstretched and there is a limit to what it can do.
That has been a bit of a diversion on this excellent Bill. My final point concerns the words
“sent or posted with malicious intent”.
It is going to be very difficult to prove that. The definition needs to be tightened up and turned into something more like “sent or posted with apparent malicious intent” because other people have to judge it. It is no good some bright little person sitting there and saying, “Oh, I didn’t realise that anyone would kill themselves because of this. I was just playing around.” This offence probably needs to be tightened a little bit; we are going to have to rely on a certain amount of judge-made law to interpret how “malicious intent” is to be registered and understood.
Having said all that, I welcome the Bill. It is an excellent step forward from a most hard-working Member of this House, with whom I have had a lot to do over the years. We are very lucky that she is here. I wish her Bill well and I am sure that the Minister will do his best to help.
My Lords, I congratulate the noble Baroness, Lady Finlay of Llandaff, on her choice of subject for this Private Member’s Bill and her success in the ballot to bring it before your Lordships today. She has made the case for the Bill clearly so I will add just a few remarks. In so doing, let me say that it is a pleasure to follow the noble Lord, Lord Balfe; I agree with him that this excellent Bill is a good step forward.
The importance of the internet and social media platforms in education is well known and acknowledged. However, what educators must know—they want and need to know this—is that, if they recommend the use of social media platforms, they will not be putting children and young people in harm’s way, in particular because of the algorithms and artificial intelligence in use. We know that many young people and children are living tough lives at present so our role as legislators must be to offer them all possible protections, both in real life and online, on all aspects of social media; of course, that goes for not just young people but adults too.
Earlier this week, along with the noble Baroness, Lady Finlay, and the noble Lord, Lord Balfe, I had the experience of viewing what can only be considered material that obviously promoted self-harm, even suicide. All of us in this Chamber are acquainted with the tragic case of Molly Russell. Her father has described her on many occasions as having shown no signs of mental ill-health, yet she took her own young life after viewing an incredible volume of graphic material promoting self-harm and suicide. Social media platforms require regulation to prevent the volume of material promoting self-harm that is currently so easily accessible and available.
According to the excellent Library briefing, Ofcom has found that 64% of parents are concerned that their children will see content that might encourage them to harm themselves, with this concern highest among the parents of eight to 11 year-olds. We must act. This Bill from the noble Baroness, Lady Finlay, gives us that opportunity. I offer it my full support; I really hope that the Government will support it too.
My Lords, I support the Bill. I congratulate the noble Baroness on bringing it to the House and on her passionate, common-sense opening speech.
In September 2022, the 3 Dads Walking—Andy Airey, Tim Owen and Mike Palmer—set off from Belfast on their second walk, which was part of a month-long, 600-mile trek between all four parliaments of the UK to raise awareness of suicide prevention across the country. They are only too aware of the influence that the internet can have on vulnerable young people. Their mission to raise awareness started after losing their beautiful daughters, Sophie, Emily and Beth, to suicide.
Before this trek across the country, the 3 Dads had previously walked between their homes, from Cumbria to Manchester to Norfolk. During those walks, they heard stories from so many parents and young people about the influence that the internet had on their loved ones in making that tragic decision to take their own life. To think that those young people could have been encouraged to self-harm, and ultimately take their own lives, through social media and the internet is unforgivable. It is totally unacceptable that vulnerable young people can be encouraged so readily into suicide, can research suicide methodology and can easily access the tools to take their own lives.
Regrettably, this is a story that the 3 Dads have heard many times. I have heard the same tragic tale from both parents and teachers who are involved in counselling children and young people in schools. Social media and internet search engine companies have a duty of care to their users. Positive signposting should be the norm. A search on the internet for suicide or self-harm should result in positive signposting to available help, not to the detail to which many search engines and social media platforms currently direct the user. We have to acknowledge that suicide prevention across society is complex but it is something we need to invest in.
We must not accept that suicide is the biggest killer of the under-35s and do nothing to prevent it, or turn a blind eye to the astonishing fact that over 200 schoolchildren take their own lives every year. What has society come to? There must be education in schools about this issue and about the consequences, and to give young people hope. I hope that the Online Safety Bill, which is now being debated in this House, will also play its part by bringing in legislation to safeguard and protect children and young people. That is so necessary.
This is a generation that has grown up around the internet, and as decision-makers we must do everything in our power to make that environment as safe as possible. I passionately believe that this Bill, together with suicide prevention being taught to kids in school and robust measures in the Online Safety Bill, would be a step in the right direction. Andy, Tim, Mike and I wholeheartedly support this Bill, as it will consider and protect vulnerable young people. Most of all, it will save lives.
My Lords, I welcome the opportunity to speak in this debate and to support my noble friend Lady Finlay in her work. This is a valuable opportunity to cover some of the issues that cut across this Bill and the Online Safety Bill, and how they complement each other. I spoke on the Online Safety Bill earlier this week and found it an emotional experience, as many in your Lordships’ Chamber did, but that shows how important both Bills are. I also thank the Minister, who we all know has had a very busy week.
Social media, at its best, is incredible. It has helped me in my work here. People listening to debates have sent me briefing notes. People have helped me to navigate train cancellations. One night, leaving your Lordships’ Chamber very late, I posted that I had missed having anything to eat, and had people offering to bring me pizza at Peers’ Entrance, offering me access to their homes to cook me food and, when I got back to where I was staying at the time, someone had left a cheese sandwich outside my door. It was truly lovely.
However, we are a very long way away from when social media seemed to be about posting pictures of cute cats. Now, sadly, it has become a very dark place, where images, push notifications and disturbing content can be found all too easily. It circles back around very quickly as well. For all the good and bad that it can bring, it does sometimes feel that we are shouting into a void, where perceptions and misconceptions can be validated by someone, sometimes many times. As I stated earlier in the week, I do not want to stifle free speech on social media. I follow people whom I strongly disagree with, but it is important to be able to sense check your views. However, we must now look at drawing a line in the sand. The powerful speeches that we heard this week about the dangers that exist, and the tragic case of Molly Russell, bring into stark reality that we must do more than we are currently doing.
What we saw at the meeting organised by my noble friend Lady Kidron was graphic and appalling. We know that social media can be a rabbit hole, and never more so than when we were in lockdown, with daily routines completely upended. On the back of the pandemic and lockdown, we are seeing the long-term impact on mental health and well-being. This needs to be considered. It is not surprising that so many people had suicidal thoughts. We must find positive solutions to deal with this.
I briefly mentioned on the Online Safety Bill that the triple lock is not enough. I did not discuss legal but harmful. I do not think that we should have one rule for what is illegal in the real world and one for what is illegal online. One of the challenges is that some people are finding it harder and harder to differentiate between the two, especially as the technology develops that blurs those lines. However, away from the internet and in real life, the ability to access potentially damaging information is very different. In real life, you do not have constant push notifications or algorithms thrusting this data at you. Therefore, we must explore this further through both Bills.
I thank Samaritans for its briefing on this Bill, which has been extremely useful, and mention the Swansea University research, which shows that three-quarters of the people who harmed themselves did so more severely after viewing self-harm online. To end on a more positive note, it was wonderful this week to hear so many noble Lords talk about this not being a party-political issue. It is not. In that spirit, we should take all the good from this Bill and work with the Online Safety Bill to really protect internet users.
My Lords, I too am very grateful to the noble Baroness, Lady Finlay, for introducing this Private Member’s Bill, which supplements the lengthy Online Safety Bill that your Lordships’ House discussed earlier this week. That Bill would set up Ofcom as an online safety regulator.
At first, I thought that this Bill was “getting on the front foot” legislation, but it is more aptly “keeping us on the front foot” legislation, when arguably we have been on the back foot for so long. It is not about censoring content before it is online but about ensuring that Ofcom is keeping the Government, Parliament and the public up to date with what is happening online in terms of self-harm and suicide content.
The Bill would ensure that the Government get both advice on the effectiveness of regulations and recommendations from Ofcom. Importantly, it would ensure that we do not get into a stop-start pattern of reviews when we have cases of self-harm and suicide. Reviews are often triggered only by a terrible tragedy and the comments of the coroner. That puts real pressure on a family and puts them through additional pain. If the Government knew that Ofcom had this role of recommendation and monitoring content, then it would be the body that they would go to and there would be a regular pattern of reporting to government. We know that the internet and technology are always developing, so we need a vehicle to keep us abreast of this.
When we legislate, I always look for precedent and analogy. This role for Ofcom would be akin to the role that the Advisory Council on the Misuse of Drugs has in relation to the Home Office. That council keeps under review the situation of drugs which appear to be being misused. We saw it respond nimbly to the swift development of legal highs by establishing the novel psychoactive substances committee. In that context, the Government cannot wait for legislation or statutory instruments to deal with these fast-changing chemical developments. The body proposed in the Bill would enable us, to some extent, to keep pace with developments on the internet.
I understand that His Majesty’s Government have committed to introducing an additional offence of encouraging and assisting self-harm. When it comes to the notices and penalties under the Online Safety Bill, obviously some firms will have our best lawyers looking at cases. I am not in that category, but might there be arguments about whether self-harm, with “self” meaning “the human person”, would cover content that uses humanoids? It could be argued that they are not too much like human beings at the moment, so putting that kind of content online could not possibly encourage someone to self-harm. However, as they and the evidence on our human response to seeing humanoids through our phones develop, they might be found to encourage self-harm. It is on that kind of development and the evidence behind it that we need recommendations as to whether we should change what the Online Safety Bill covers.
It would also be useful to monitor this content because it will ensure that Ofcom reports to us on what content it feels is within the Online Safety Bill and what content it has decided is outside it. Ofcom may come to us with more recommendations for the Government to consider whether that content should be brought from beyond the Online Safety Bill and into its coverage. However, only if we see this monitoring by Ofcom, as suggested in this Bill, can the Government and Parliament be properly equipped to achieve His Majesty’s Government’s intention of making Britain the safest place in the world to be online.
My Lords, first I will apologise for being late to this debate—five seconds, according to the annunciator. AI assisted me in getting here, because my Fitbit is synced to my phone and there was a message from the Whips saying, “Get in here fast”, so I got here as quickly as I could. Clearly, the previous Private Member’s Bill moved rather swiftly. This one is very important, as are all Private Members’ Bills, and it necessitates a lot of reflection.
This morning, I would like to take noble Lords back to an earlier era, long before the internet. From looking around the Chamber, I think most but not all noble Lords remember life before the internet.
I want to tell your Lordships a story about Eileen. Eileen was 11 when her father died, and she was very close to her father. She was 17 when her eldest sister died; it was a sudden and unfortunate death, and Eileen descended into difficulties with mental health and anorexia. The anorexia persisted from the age of 17 until she was 40. She married and had a child, so she managed to function, but, at some point, the daughter came home to find that her mother had been taken to hospital with an overdose. It was never clear to the daughter whether the overdose was intentional or not. Her mother survived and, at that point, got appropriate treatment.
Fast forward almost 40 years. When the mother was lying on her deathbed with COPD, caused by chronic smoking and addiction, she apologised to the daughter, and by extension to her ex-husband, for the difficulties that she had put her family through. She said, “I knew I wasn’t going mad, but I felt as if I was going mad. The only way I could cope, until I saw a psychiatrist who knew how to help me, was by waiting, counting the minutes until I could have my next cigarette.” In those days, there was no internet, just television and film advertisements for the tobacco industry, which was legal but clearly harmful. This is about addiction.
If Eileen had been born in the age of the internet, she would not have been waiting for the next cigarette, which she would light herself. She would have been impacted by internet sites and algorithms because, as soon as she started seeing things on the internet, there would be a push factor. You need to look at only one internet site for the algorithms to kick in.
Before the debate, I looked at academic research on eating disorders and the internet. I randomly clicked on a report from 2012. The author, Dr Emma Bond from the then University Campus Suffolk—which is not a campus I had heard of—produced a report funded by the Nominet Trust that looked at only 126 websites that are pro-ED and pro-ana. “Pro-ana” internet sites support anorexia. They do not support victims of anorexia or purport to help young people who have anorexia; they glorify anorexia and eating disorders. That was a study into 126 websites 10 years ago, but that was not the sum total of relevant websites; these were only the websites that did not have passwords or were not in the dark web. These were easily available, open-source internet sites.
We have all heard of Molly Russell and the cases that my noble friend Lady Benjamin referred to earlier. The internet can be a source of good or it can be a source of real difficulty for people—those who are most vulnerable or are at risk of addiction. The algorithms are potentially very dangerous, so it is incredibly important to put this legislation on the statute books. I know from the Library briefing that the noble Baroness, Lady Finlay, has suggested that her Private Member’s Bill could also be taken up as an amendment to the Online Safety Bill. If that were possible, it would be welcome. Perhaps the Minister could explain whether the Government are open to such an amendment.
Before I sit down, I should declare the interest that Eileen was my mother.
My Lords, I follow a very moving speech from my noble friend Lady Smith. As many noble Lords have, I welcome this valuable chance to follow up on some of the issues that were raised at the Second Reading of the Online Safety Bill this week. I thank the noble Baroness, Lady Finlay, for the excellent and comprehensive introduction to her Bill and other noble Lords who have shared the concerns and supported the Bill so eloquently.
As many have said, the scale of the issue is clear. Ian Russell, who attended every minute of the Second Reading debate on Wednesday, has the admiration of the House. There were many references to him in the debate, and his testimony is damning and shocking. Many noble Lords who are in the Chamber now or were here on Wednesday were there for his presentation of the thousands of posts that were made to his daughter Molly, before her death, which encouraged self-harm and suicide. Many who have been involved in online safety since the Green Paper and before were shocked. Even those who had been inured to issues of the internet were utterly shocked by the sheer scale of the messaging—thousands and thousands across every platform to which Molly had access.
So I welcome the promise of a new offence but, as mentioned by the noble Baroness, Lady Berridge, and by my noble friend Lady Smith in relation to eating disorders, under the Online Safety Bill, only content that is illegal will be properly caught when this is applied to adults. As has been pointed out by a number of noble Lords, particularly the noble Baroness, Lady Finlay, there is a cliff edge between childhood and adulthood, and we are going to treat 18 year-olds as adults from the day they turn 18. That cannot be right in these circumstances, without involving further risk assessments, protection and monitoring—which this Bill would provide. This is the result of some very recent changes to the Online Safety Bill. As the noble Baroness, Lady Finlay, our briefing and the Samaritans’ briefing cogently describe, it stems from the deletion of the duty to carry out a risk assessment of legal but harmful content. The Online Safety Bill has been watered down; there is no doubt about that.
The proposals of the noble Baroness, Lady Finlay, are modest. I hope she also tables them as an amendment to the Online Safety Bill in Committee. As the Minister and his department have heard very eloquently from the noble Lord, Lord Stevenson, and from around the House, this is very much something that we want to get right on a cross-party basis. I hope that they take on board the proposals in this Bill, having heard the voices on Wednesday and from around the House today.
In essence, the Bill gives Ofcom a duty to devote resource—and my noble friend Lady Benjamin quite rightly talked about investment—as, under the current form of the Online Safety Bill, it would not have a duty to monitor this kind of content and advise on the effectiveness of current regulation and what needs changing in light of the harm being caused.
In the light of the evidence we have heard and the fact that in the Bill, as it currently stands, there is not even the duty of risk assessment for category 1 content of this kind, this seems the bare minimum that the Government can agree to. This is an effective way of future-proofing the Bill, which, as we heard today and on Wednesday, is absolutely necessary. We cannot keep playing catch-up with the technology and the harms that it can create. I will resist the temptation to digress on the many risks and opportunities that new technology, AI and algorithmic systems can create, but I thought the noble Baroness’s closing statement that yesterday’s science fiction is here today is absolutely apposite. Our regulation absolutely needs to take account of this, so we on these Benches thoroughly support the noble Baroness’s Bill.
My Lords, I am most grateful to all noble Lords who have spoken today for their wisdom and their feeling by bringing into the Chamber the names of those who took their lives. In so doing, we honour their memories and, I hope, strengthen our resolve to do what we can to get this legislation right in considering both the Private Member’s Bill today and the Online Safety Bill. I cannot quite find the words, but I wish to acknowledge warmly the particular openness and bravery of the noble Baroness, Lady Smith, in what she said today.
I congratulate the noble Baroness, Lady Finlay, on—as ever—bringing a valuable focus and a very practical approach to our deliberations in this area. We could say that it is overdue or very timely. I will go with very timely, bearing in mind that we have rightly given very detailed consideration to the Online Safety Bill this week on Second Reading.
Perhaps I can give some additional context, which it is important to reflect on. Suicide is the leading cause of death in males under 50 years old and females under 35 years old. More than 5,500 people in England and Wales tragically took their lives in 2021. These figures show the largest increase in suicide for females under 24 since records began. Self-harm, a strong risk factor for future suicide, has also increased among young people since 2000 and is more common among young people than any other age group. It is important to acknowledge that the impact of suicide is not just on those who tragically take their own lives but courses through the lives and well-being of many communities and those who knew, loved and cared for those people, who felt they had only one tragic option before them.
As we have heard today, the internet can be an invaluable space for individuals who experience self-harm and suicidal feelings. It provides opportunities for users to speak openly and access support, but it can also provide access to content that will act to encourage, maintain or exacerbate self-harm or suicide. As the noble Baroness, Lady Benjamin, said, although the reasons for suicide and self-harm are complex, and they are rarely caused by one thing, it is a fact that, in many cases, the internet is involved. I, too, am grateful to the Samaritans, whose research showed that at least one-quarter of those who self-harmed with high suicidal intent had used the internet in connection with their self-harm.
As my noble friend Lady Blower said, social media platforms are sources of learning, advice and support for their users, particularly young people and children, and are to be valued for that very purpose, but we have heard today, rightly repeatedly, about the case of Molly Russell, who killed herself at the age of 14 having viewed graphic images of self-harm and suicide on a social media platform. We need to reflect that the coroner ruled that the content that Molly had viewed related to depression, self-harm and suicide, and it had contributed to her death in more than a minimal way. As the noble Lord, Lord Clement-Jones, has just reminded us, many noble Lords attended the meeting this week at which we were honoured, if that is the right word, to have Molly’s father join us in our deliberations on the Online Safety Bill. At that meeting, which was also attended by the family’s solicitor, the images that were shown were shocking in their scale and effect, and I know that many noble Lords remain deeply impacted by them.
Research from Ofcom last year showed the extent of the scale that we are dealing with. One-third of children aged between five and seven use social media, and that rises to 97% of young people aged 16 to 17. We need to work not only with young people but with their parents, because many parents are anxious that they are not able to assist and equip their children to deal with the potential harms of social media. The Private Member’s Bill introduced by the noble Baroness, Lady Finlay, addresses an important point: how do we make online protections work? How do we keep them under review?
It has already been indicated that perhaps the aims of her Private Member’s Bill could be achieved through an amendment to the Online Safety Bill, and that in debate on the Online Safety Bill the Minister gave a number of assurances, including that material encouraging or assisting suicide would be one of the priority offences, which means that, in practice, all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms. In all of this, of course, the devil is in the detail, as we know, and the noble Baroness’s Bill focuses our minds.
As I come to my closing remarks, I emphasise the point, which we have heard many times, that the creation of an offence of sending a communication that encourages self-harm is to be welcomed. However, as the Samaritans have pointed out, all such content needs to be regulated across all platforms for all users. Also, to use the words of the noble Baroness, Lady Finlay, turning 18 is a cliff edge at present and one that we do not want to keep, because turning 18 does not stop people being vulnerable to suicide or self-harm content.
Given this week’s lengthy debate, which was extremely welcome and well informed, the points in the Bill before us and what can be done in the Online Safety Bill, I hope that the Minister will give the assurance that any amendments that deal with the points before us today will come forward as soon as possible. We are keen to see those working texts. I am sure he will meet those who have a concern in this area.
I also ask the Minister to give your Lordships’ House full assurance that adults as well as children will be protected from dangerous suicide and self-harm content, and that it will not just be left to adults to deal with it themselves. In making that point, I once again emphasise the need for the Online Safety Bill to allow for proper media literacy so that adults and children are fully equipped. I look forward to hearing the Minister’s response.
My Lords, I am very grateful to the noble Baroness, Lady Finlay of Llandaff, for bringing forward her Bill, and to all noble Lords who have taken part in our debate, most particularly the noble Baroness, Lady Smith of Newnham, whose powerful, brave and personal words moved us all but also underlined the importance for so many families of the topic we are discussing today. The Government fully understand just how devastating these harms are, both to children and to adults, and the effect that those harms have on their families and friends, as well as the role that social media platforms and search engines can play in exacerbating them.
As the noble Baroness, Lady Finlay, outlined, her Bill was due to be read a second time the day after the death of Her late Majesty the Queen. That very sad reason for delay has meant that we are able to look at it alongside the Online Safety Bill, now before your Lordships’ House, which is helpful. I will endeavour to explain why the Government think that Bill deals with many of the issues raised, while keeping an open mind, as I said at its Second Reading on Wednesday, on suggestions for how it could do so more effectively.
I will first address the scope and intentions of the Online Safety Bill, particularly how it protects adults and children from horrific content such as this. As I outlined in our debate on Wednesday, the Online Safety Bill offers adult users a triple shield of protection, striking a balance between forcing platforms to be transparent about their actions and empowering adult users with tools to manage their experience online.
The first part of the shield requires all companies in scope of the Bill to tackle criminal activity online when it is flagged to them. They will have duties proactively to tackle priority illegal content and will need to prevent their services being used to facilitate the priority offences listed in the Bill, which include encouraging or assisting suicide.
The second part of the shield requires the largest user-to-user platforms, category 1 services under the Bill, to ensure that any terms of service they set are properly enforced. For instance, if a major social media platform says in its terms of service that it does not allow harmful suicide content, it must adhere to that. I will address this in greater detail in a moment, but Ofcom will have the power to hold platforms to their terms and conditions, which will help to create a safer, more transparent environment for all.
The third part of the shield requires category 1 services to provide adults with tools either to reduce the likelihood of encountering certain categories of content, if they so choose, or to alert them to the nature of that content. That includes content that encourages, promotes or provides instruction for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified accounts, if they wish. That will give them the power to address the concern raised by my noble friend Lord Balfe about anonymous accounts. If anonymous accounts are pushing illegal content, the police already have powers through the Investigatory Powers Act to access communications data to bring the people behind that to book.
Through our triple shield, adult users will be empowered to make more informed choices about the services they use and have greater control over whom and what they engage with online.
As noble Lords know, child safety is a crucial component of the Online Safety Bill, and protecting children from harm remains our priority. As well as protecting children from illegal material, such as intentional encouragement of or assistance in suicide, all in-scope services likely to be accessed by children will be required to assess the risks to children on their service, and to provide safety measures to protect them from age-inappropriate and harmful content. This includes content promoting suicide, eating disorders and self-harm that does not meet a criminal threshold, as well as harmful behaviour such as cyberbullying.
Providers will also be required to consider, as part of their risk assessments, how functions such as algorithms could affect children’s exposure to illegal and other harmful content on their service. They must take steps to mitigate and manage any risks. Finally, providers may need to use age-assurance measures to identify the age of their users, to meet the child safety duties and to enforce age restrictions on their service.
A number of noble Lords talked about algorithms, so I will say a little more about that, repeating what I outlined on Wednesday. Under the Online Safety Bill, companies will need to take steps to mitigate the harm associated with their algorithms. That includes ensuring that algorithms do not promote illegal content, ensuring that predictive searches do not drive children towards harmful content and signposting children who search for harmful content towards resources and support.
Ofcom will also be given a range of powers to help it assess whether companies are fulfilling their duties in relation to algorithms. It will have powers to require information from companies about the operation of their algorithms, to interview employees, to require regulated service providers to undergo a skilled persons report, and to require audits of companies’ systems and processes. It will also have the power to inspect premises and access data and equipment, so the Bill is indeed looking at the harmful effects of algorithms.
Moreover, I am pleased that many of the ambitions that lie behind the noble Baroness’s Bill will be achieved through a new communications offence that will capture the intentional encouragement and assistance of self-harm, as noble Lords have highlighted today. That new offence will apply to all victims, adults as well as children, and is an important step forward in tackling such abhorrent content. The Government are considering how that offence should be drafted. We are working with colleagues at the Ministry of Justice and taking into account views expressed by the Law Commission. As I said on Wednesday, our door remains open and I am keen to discuss this with noble Lords from all parties and none to ensure we get this right. We look forward to further conversations with noble Lords between now and Committee.
Finally, I want briefly to mention how in our view the aims of the noble Baroness’s Bill risk duplicating some of the work the Government are taking forward in these areas. The Bill proposes requiring Ofcom to establish a unit to advise the Secretary of State on the use of user-to-user platforms and search engines to encourage and assist serious self-harm and activities associated with the risk of suicide. The unit’s advice would focus on the extent of harmful content, the effectiveness of current regulation and potential changes in regulation to help prevent these harms. The noble Baroness is right to raise the issue, and I think her Bill is intended to complement the Online Safety Bill regime to ensure that it remains responsive to the way in which specific harms develop over time.
On Wednesday we heard from my noble friend Lord Sarfraz about some of the emerging threats, but I hope I have reassured the noble Baroness and other noble Lords that suicide and self-harm content will be robustly covered by the regime that the Online Safety Bill sets up. It is up to Ofcom to determine how best to employ its resources to combat these harms effectively and swiftly. For instance, under the Online Safety Bill, Ofcom is required to build and maintain an in-depth understanding of the risks posed by in-scope services, meaning that the regime the Bill brings forward will remain responsive to the ways in which harms manifest themselves both online and offline, such as in cases of cyberstalking or cyberbullying.
The Government believe that Ofcom as the regulator is best placed to hold providers accountable and to respond to any failings in adhering to their codes of practice. It has the expertise to regulate and enforce the Online Safety Bill’s provisions and to implement the findings of its own research. Its work as the regulator will also consider evidence from experts across the sector, such as Samaritans, which has rightly been named a number of times today and kindly wrote to me ahead of this debate and our debate on the Online Safety Bill. We therefore think that this work covers the same ground as the advisory function of the unit proposed in the noble Baroness’s Bill, and I hope this has reassured her that the area that she highlights through it is indeed being looked at in the Government’s Bill.
That is why the Government believe that the Online Safety Bill now before your Lordships’ House represents the strong action that we need to prevent the encouragement or assistance of self-harm, suicide and related acts online, and why we think it achieves the same objectives as the noble Baroness’s Bill. It is further strengthened, as I say, by the new stand-alone offence that we are bringing forward which addresses communications that intentionally encourage or assist self-harm, about which I am happy to speak to noble Lords.
I am glad we have had the opportunity today, between Second Reading and Committee of that Bill, to look at this issue in detail, and I know we will continue to do so, both inside and outside the Chamber. For the reasons I have given, though, we cannot support the noble Baroness’s Private Member’s Bill today.
My Lords, I am extremely grateful to everyone who has spoken today. I am most grateful to the Minister for stressing that he is keeping an open mind and has an open door. Of course, a Private Member’s Bill should not conflict in any way with a really major piece of legislation. It has been clear that we all want the same thing: we want to make things safer, not less safe.
I am particularly grateful to the noble Baroness, Lady Smith of Newnham, for having shared with us the real issue of addiction that is behind so many of the behaviours that become harmful and the behaviours that capture people in extremely destructive behaviour. It is that addiction cycling the brain, born out of childhood trauma, that she illustrated to us so powerfully.
I am also grateful to all who have paid tribute to the parents who, in their pain, have had the courage to say, “We must do something.” They have been named in this Chamber.
The noble Baroness, Lady Blower, with her extensive awareness of education, has rightly highlighted how it is actually the young who move forward. The noble Baroness, Lady Merron, has pointed out that the data does not stop at 18; the tragedies carry on. As has also been pointed out by the noble Baroness, Lady Benjamin, it is students who kill themselves as well. Every university dreads the phone call that one of its students has killed themselves, and every university dreads discovering what it had missed in the antecedents to that disaster.
My noble friend Lady Grey-Thompson pointed out the important work that has come out of Swansea showing how viewing content really escalates the desire to self-harm; it is that hooking in that takes hold. I am grateful to the noble Lord, Lord Balfe, for suggesting the wording of “apparent malicious intent”, because of course there are people out there of malicious intent, and they will always make some nice wriggly excuse as to why what they are doing is not really harming anyone else.
Before I came into this debate, I had a call with my noble friend Lady Kidron about what is emerging about the metaverse. It is beyond anything that any of us have imagined; it is unbelievably harmful. As the noble Lord, Lord Clement-Jones, said, we must not be playing catch-up. It is the metaverse that will present the greatest threat, because it plays on mental distortion to expand it, and that increases the mental harms to everyone.
I am really grateful that we had this debate today, and I think it was timely that it came in between Second Reading and Committee on the Online Safety Bill. I assure the Minister that I and my noble friends within this Chamber on all Benches will be beating a path to his open door. I do not think he is going to be able to close it, and in fact he will not be able to lock it because we will just break it down. We need to move this forward and get it right. I beg to move.
Lords Chamber
My Lords, I understand that no amendments have been set down to this Bill and that no noble Lord has indicated a wish to move a manuscript amendment or to speak in Committee. Unless, therefore, any noble Lord objects, I beg to move that the order of commitment be discharged.
Lords Chamber
My Lords, I thank the Samaritans most sincerely for all the work they have done with me for a very long time on this Bill and its background. I also thank those parents and families who have shared the overwhelming distress and tragedy of discovering that their child, brother or sister had been goaded and pushed into suicide by exposure to repeated messages, coming particularly through the internet and often completely unknown to the family. That was the motivation behind this Bill.
I am also extremely grateful to the Government, officials from Ofcom and the noble Lord, Lord Grade, in particular, for the work they are doing to make sure that this scourge that happens to our young people is adequately tackled. We are in a strange position because the Online Safety Bill is in Committee here at the moment and we are about to debate a government amendment which I hope will help address this problem.
I also commend Ofcom on the way it is already developing robust risk assessment and risk management processes because it has recognised just how harmful some of this activity is. I also thank all Members of this House for the support they have given me at all times when we have discussed this Bill and for the recognition across the House, particularly from the noble Baroness, Lady Kidron, of the importance of tackling this major problem in our society. I beg to move.
My Lords, I give my warmest congratulations to the noble Baroness, Lady Finlay, on the progress of this very important Bill. I associate myself and my colleagues on these Benches with the thanks and appreciation extended to Ofcom for its involvement, to the Samaritans for their work not just on this Bill but day in, day out and of course to the bereaved families for their bravery and dignity in speaking out on this Bill and on so many other occasions, which I hope has really supported improvements for the future. I also thank the Minister and officials in his department and am grateful to noble Lords across the House, as ever, for their concern and consideration of this matter.
In seeing this Bill pass, I believe we honour those who have taken their own lives. I hope we give some small comfort and hope to the friends, families and communities who suffer the pain of tragedy and bereavement, having lost their loved ones. As we heard at Second Reading, the internet can be invaluable and positive in providing a space to speak openly and seek support but, regrettably, it can also mean content that encourages self-harm and suicide. At its worst, it is configured to bombard those who are at risk. We should reflect that the coroner ruled that the content that the late Molly Russell had viewed related to depression, self-harm and suicide and that it contributed to her death in more than just a minimal way.
As the noble Baroness said, the passage of this Bill coincides with the long-awaited Online Safety Bill; we will debate government amendments on this issue next Thursday as part of the group on communications offences. As the Minister would expect, we will seek a number of clarifications and, if necessary, any improvements. For today, I congratulate the noble Baroness, Lady Finlay, on her determination and work and wish this Bill all the very best as it continues on its path.
My Lords, I thank the noble Baroness, Lady Finlay, for tabling this Private Member’s Bill. Her knowledge and experience of these issues is highly regarded, rightly, on all sides of the House. I also thank all noble Lords who have contributed to this important debate so far. Like the noble Baroness, I call out the Samaritans for their ongoing brilliant work in this area.
As my noble friend Lord Parkinson set out at Second Reading and in Online Safety Bill Committee debates, the Government recognise the devastating impact of suicide and self-harm content, which has affected countless lives and families. We remain committed to addressing this material and giving vulnerable users the protection they deserve. While my department is leading this work, it is part of a cross-government approach which will go a long way to protecting people from suicide and self-harm content online.
I do not wish to repeat recent discussions, but I can assure the noble Baroness that the Online Safety Bill has been carefully designed to ensure that users are better protected from this content, with the strongest protections reserved for children. On top of this, we have tabled an amendment to the Bill to introduce a new self-harm offence, as has been mentioned, which noble Lords will have an opportunity to debate next week in Committee. Further, the powers granted to Ofcom via the legislation will protect users and negate the need for the noble Baroness’s Private Member’s Bill. Ofcom has the expertise to regulate and enforce the Bill’s provisions and implement its own research findings.
I thank the noble Baroness again for bringing her Bill to the House and facilitating this important debate, but I hope noble Lords are reassured by the Government’s extensive work in these areas and I hope that the noble Baroness will appreciate that, for the reasons set out, the Government cannot support this Private Member’s Bill.
My Lords, I am most grateful for the very generous words of the noble Baroness, Lady Merron, and for her understanding of the background to this. I am also grateful to the Government for the discussions we have had and recognise what has been said. We have more to debate. However, I emphasise that prevention of suicide and self-harm is essential and involves many different government departments and people across the whole of society. At the moment, I beg to move that this Bill do now pass.