Lord Allan of Hallam (LD)

My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.

I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?

If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.

I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do—indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.

Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That will be for one of two main reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and an error rate of even 1% across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator-error reports can be dealt with internally.

The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.

The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.

Baroness Kidron (CB)

We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.

Lord Allan of Hallam (LD)

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

Baroness Harding of Winscombe (Con)

My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, not from direct experience in the social media world but, tangentially, from telecoms regulation.

I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.

While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. By the time someone had got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the least complained-about provider to the independent complaints mechanism.

So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.

For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.

From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.

--- Later in debate ---
Viscount Camrose (Con)

As I said, we are happy to consider individual complaints and super-complaints further.

Lord Allan of Hallam (LD)

Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but carrying more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?

Viscount Camrose (Con)

Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.

Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping it to target resources and to take action against systemic failings.

On the steps required after a super-complaint, the regulator will be required to respond publicly to it. Issues raised in a super-complaint may lead Ofcom to take steps to mitigate them, where they can be addressed via the Bill’s duties and powers. In this way, super-complaints perform a vital role in Ofcom’s horizon scanning, ensuring that it is aware of issues as they emerge. However, super-complaints are not linked to any specific enforcement process.

--- Later in debate ---
Baroness Morgan of Cotes (Con)

My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.

I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, high-functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook

“a similar act, resulting in her death”.

I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.

We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is this: where we know that content is harmful—to individuals but also to broader society—why do the Government not want to take the step of setting out how that content should be properly regulated? I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.

This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content, applicable to both children and adults, recognising the cliff edge that otherwise exists in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.

Lord Allan of Hallam (LD)

My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given its importance. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I do so more because the way in which this kind of content is shared is incredibly complex than simply because of the question of whether it is legal or illegal.

--- Later in debate ---
Lord Allan of Hallam (LD)

From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little, as I also worked with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone were to say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes companies are defending the indefensible, but there are also people agonising over the right thing to do, and we should help them.

Baroness Kidron (CB)

I absolutely agree. Of course, good law is a good system, not a good person.

I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding on reading the Bill very closely is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms. Category 1 services will have to provide a user empowerment tool so that an adult user can toggle such content out if they wish. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand by to be corrected.

In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools to toggle self-harm and suicide material out by default. This means that such material can be as prevalent as a service likes—pushed, promoted and recommended, as I have just explained—so long as it is not contrary to the terms of service and does not reach the bar of illegal content.

Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.

I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.

I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Lord Allan of Hallam (LD)

Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand this: if we had a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?

Lord Parkinson of Whitley Bay (Con)

My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.

--- Later in debate ---
The noble Baroness asked about the metaverse, which is in scope of the Bill as a user-to-user service. The approach of the Bill is to try to remain technology neutral.

Lord Allan of Hallam (LD)

I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.

The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.

It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention of the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.

The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.

Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.