My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a matter of triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.
Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.
We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.
However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.
Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.
We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.
Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.
Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.
The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.
If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.
My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.
In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.
For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.
The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.
The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.
The same tension is relevant to Amendment 165 and the terms of service: the more granular and detailed your terms of service are, the better they are for public information, but the more opportunities there are to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms any more, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.
Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that, once you had produced one, nobody was interested. At the risk of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to discuss, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, every time we published a transparency report, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.
I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.
My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is around terms of service for user-to-user services and ensures that information can be sought on the scope as well as the application. This is important because so much has been put on user-to-user services as well as on terms of service. You need to know what is going on.
I particularly want to compliment Amendment 229, which says that transparency reports should be
“of sufficient quality to enable service users and researchers to make informed judgements”,
et cetera. That is a very elegant way in which to say that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I am hoping that that is the requirement.
My Lords, I strongly support the amendment in the names of the noble Lords, Lord Knight and Lord Stevenson, as well as my noble friend Lady Featherstone. The essence of the message from the noble Lord, Lord Knight, about the need for trust and the fact that you can gain trust through greater transparency is fundamental to this group.
The Joint Committee’s report is now a historical document. It is partly the passage of time, but it was an extraordinary way in which to work through some of the issues, as we did. We were much affected by the evidence given by Frances Haugen, and by the fact that certain things came to light only as a result of her sharing information with the Securities and Exchange Commission. We said at the time that:
“Lack of transparency of service providers also means that people do not have insight into the prevalence and nature of activity that creates a risk of harm on the services that they use”.
That is very much the sense that the noble Lord, Lord Stevenson, is trying to get to by adding scope as well.
We were very clear about our intentions at the time. The Government accepted the recommendation that we made and said that they agreed with the committee that
“services with transparency reporting requirements should be required to publish their transparency reports in full, and in an accessible and public place”.
So what we are really trying to do is to get the Government to agree to what they have already agreed to, which we would have thought would be a relatively straightforward process.
There are some other useful aspects, such as the review of effectiveness of the transparency requirements. I very much appreciate what my noble friend just said about not reading transparency reports. I read the oversight reports but not necessarily the transparency reports. I am not sure that Frances Haugen was a great advert for transparency reports at the time, but that is a mere aside in the circumstances.
I commend my noble friend Lady Featherstone’s Amendment 171, which is very consistent with what we were trying to achieve with the code of practice about violence against women and girls. That would fit very easily within that. One of the key points that my noble friend Lord Allan made is that this is for the benefit of the platforms as well. It is not purely for the users. Of course it is useful for the users, but not exclusively, and this could be a way of platforms engaging with the users more clearly, inserting more fresh air into this. In these circumstances it is pretty conclusive that the Government should adhere to what they agreed to in their response to the Joint Committee’s report.
As ever, I thank all noble Lords who have spoken. I absolutely take, accept and embrace the point that transparency is wholly critical to what we are trying to achieve with the Bill. Indeed, the chandelier of transparency reports should be our shared aim—a greenhouse maybe. I am grateful for everyone’s contributions to the debate. I agree entirely with the views expressed. Transparency is vital in holding companies to account for keeping their users safe online. As has been pointed out, it is also to the benefit of the platforms themselves. Confident as I am that we share the same objectives, I would like to try to reassure noble Lords on a number of issues that have been raised.
Amendments 160A, 160B and 181A in the name of the noble Lord, Lord Knight of Weymouth, seek to require providers to make their transparency reports publicly available, subject to appropriate redactions, and to allow Ofcom to prevent their publication where it deems that the risks posed by drawing attention to illegal content outweigh the benefit to the public of the transparency report. Let me reassure the noble Lord that the framework, we strongly believe, already achieves the aim of those amendments. As set out in Clause 68, Ofcom will specify a range of requirements in relation to transparency reporting in a notice to category 1, 2A and 2B services. This will include the kind of information that is required in the transparency report and the manner in which it should be published. Given the requirement to publish the information, this already achieves the intention of Amendment 160A.
The specific information requested for inclusion within the transparency report will be determined by Ofcom. Therefore, the regulator will be able to ensure that the information requested is appropriate for publication. Ofcom will take into account any risks arising from making the information public before issuing the transparency notice. Ofcom will have separate information-gathering powers, which will enable the regulator to access information that is not suitable to be published in the public domain. This achieves the intention of Amendment 160B. There is also a risk of reducing trust in transparency reporting if there is a mechanism for Ofcom to prevent providers publishing their transparency reports.
Amendment 181A would require Ofcom to issue guidance on what information should be redacted and how this should be done. However, Ofcom is already required to produce guidance about transparency reports, which may include guidance about what information should be redacted and how to do this. It is important to provide the regulator with the flexibility to develop appropriate guidance.
Amendment 165 seeks to expand the information within the transparency reporting requirements to cover the scope of the terms of service set out by user-to-user providers. I very much agree with the noble Lord that it is important that Ofcom can request information about the scope of terms of service, as well as about their application. Our view is that the Bill already achieves this. Schedule 8 sets out the high-level matters about which information may be required. This includes information about how platforms are complying with their duties. The Bill will place duties on user-to-user providers to ensure that any required terms of service are clear and accessible. This will require platforms to set out what the terms of service cover—or, in other words, the scope. While I hope that this provides reassurance on the matter, if there are still concerns in spite of what I have said, I am very happy to look at this. Any opportunity to strengthen the Bill through that kind of clarity is worth looking at.
I welcome the Minister’s comments. I am interrupting just because this is my amendment rather than my noble friend Lord Knight’s. The word “scope” caused us some disquiet on this Bench when we were trying to work out what we meant by it. It has been fleshed out in slightly different ways around the Chamber, to advantage.
I go back to the original intention—I am sorry for the extensive introduction, but it is to make sure that I focus the question correctly—which was to make sure that we are not looking historically at the terms of service that have been issued, and whether they are working in a transparency mode, but addressing the question of what is missing or is perhaps not addressed properly. Does the Minister agree that that would be taken in by the word “scope”?
I think I probably would agree, but I would welcome a chance to discuss it further.
Finally, Amendment 229 intends to probe how Ofcom will review the effectiveness of transparency requirements in the Bill. It would require Ofcom to produce reports reviewing the effectiveness of transparency reports and would give the Secretary of State powers to implement any recommendations made by the regulator. While I of course agree with the sentiment of this amendment, as I have outlined, the transparency reporting power is designed to ensure that Ofcom can continuously review the effectiveness of transparency reports and make adjustments as necessary. This is why the Bill requires Ofcom to set out in annual transparency notices what each provider should include in its reports and the format and manner in which it should be presented, rather than putting prescriptive or static requirements in the Bill. That means that Ofcom will be able to learn, year on year, what will be most effective.
Under Clause 145, Ofcom is required to produce its own annual transparency report, which must include a summary of conclusions drawn from providers’ transparency reports, along with the regulator’s view on industry best practice and other appropriate information—I hope and think that goes to some of the points raised by the noble Lord, Lord Allan of Hallam.
My Lords, just before the Minister moves on—and possibly to save me finding and reading it—can he let us know whether those annual reports by Ofcom will be laid before Parliament and whether Parliament will have a chance to debate them?
I believe so, but I will have to confirm that in writing. I am sorry not to be able to give a rapid answer.
Clause 159 requires the Secretary of State to review the operation of the regulatory framework as a whole to ensure that it is effective. In that review, Ofcom will be a statutory consultee. The review will specifically require an assessment of the effectiveness of the regulatory framework in ensuring that the systems and processes used by services provide transparency and accountability to users.
The Bill will create what we are all after, which is a new culture of transparency and accountability in the tech sector. For the reasons I have laid out, we are confident that the existing provisions are sufficiently broad and robust to provide that. As such, I hope the noble Lord feels sufficiently reassured to withdraw the amendment.
My Lords, that was a good, quick debate and an opportunity for the noble Viscount to put some things on the record, and explain some others, which is helpful. It is always good to get endorsement of what we are doing from both the noble Lord, Lord Allan, and the noble Baroness, Lady Fox. That is a great spread of opinion. I loved the challenge as to whether anyone ever reads the transparency reports once they are published; I imagine AI will be reading and summarising them, and making sure they are not written as gobbledegook.
On the basis of what we have heard and if we can get some reassurance that strong transparency is accompanied by strong parliamentary scrutiny, then I am happy to withdraw the amendment.
My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.
Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and take proactive steps to identify and remove any listings of knives or products such as ornamental zombie knives that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case that has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed another younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.
As I understand it, the Government have announced a consultation on this, among other things, looking at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at these sites and try to regulate them much better than they currently are. I am especially concerned about the dark web.
The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government see public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, and how they think they are supposed to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.
There has been a lot of debate about this issue, it is a very sensitive matter and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.
My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate that with some concrete examples of why this is essential to bring to life the kinds of challenges that need to be dealt with.
The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do that by analysing the behaviour of users, trying to detect whether older users are consistently trying to approach younger users, and looking at the content of the messages they may be sending, where that is visible. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of it, it needs to be very mindful of privacy rights. We clearly have several potentially interested bodies in the UK environment: the child protection agencies; Ofcom, which in future will be seeking to ensure that the platform has met its duty of care; and the Information Commissioner’s Office.
A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need the regulators together to give clarity: “Yes, we have looked at this, and you will not act on the instruction of the child safety agency and then get criticised, and potentially fined, by the data protection regulator for doing the thing you have been instructed to do”—so we need those agencies to work together.
The second example is in the area of co-operation around antiterrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect it—and you are encouraged within that forum to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of competition questions, and if you are in a discussion around that, the lawyers will say, “Have the competition lawyers cleared this?” Again, therefore, something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is something on which you need a view not just from the counterterrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.
The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraudsters, which we have dealt with. Here you might have patterns of behaviour where information comes from the telecoms companies, regulated by Ofcom; the internet service providers, also regulated by Ofcom; and financial institutions, regulated by their own family of regulators—and they may want to share data with each other, which is again subject to the Information Commissioner’s Office. So, again, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out between them.
Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where the left hand and the right hand of the regulatory world pulled you in different directions. I know that we have the Digital Regulation Cooperation Forum, and we should build on such institutions. It is essential that they have their input before guidance is issued, rather than having a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing it has been instructed to do.
That leads to the very sensible Amendment 200 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer, but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two, which allows skilled persons to come from regulators—makes sense.
Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act: I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.
My Lords, so few of us are involved in this discussion that we are now able to write each other’s speeches. I thank the noble Lord, Lord Allan of Hallam, for articulating some of my concerns, probably more elegantly than I will myself. I will focus on two amendments in this group; in fact, there are lots of interesting things, but I will focus on both the amendments from the noble Lord, Lord Bassam of Brighton.
On the issue of proactive steps to remove listings of knives for young people, I am so sympathetic to this because in a different area of my life I am pretty preoccupied with the problem of knife crime among young people. It really bothers me and I worry about how we tackle it. My concern of course is that the police should be working harder to solve that problem and that we cannot expect the Bill to solve all social problems. There is a danger of shifting the focus away from law enforcement in a real-world problem, as though restricting how you can buy a knife were the issue. I am not convinced that that helps us.
I wanted to reflect on the kind of dilemmas I am having around this in relation to the story of Mizzy that is doing the rounds. He is the 18 year-old who has been posting his prank videos on TikTok and has caused quite a stir. People have seen him wandering into strangers’ homes uninvited, asking random people in the street if they want to die, running off with an elderly lady’s dog and making fun of Orthodox Jews—generally speaking, this 18 year-old is obnoxious. His TikTok videos have gone viral; everybody is discussing them.
This cruelty-for-kicks genre of filming yourself, showing your full face to the camera and so on, is certainly abhorrent but, as with the discussion about knife crime, I have noticed that some people outside this House are attempting to blame the technology for the problem, saying that the videos should have been removed earlier and that it is TikTok’s fault that we have this anti-social behaviour, whereas I think it is a much deeper, broader social problem to do with the erosion of adult authority and the reluctance of grown-ups to intervene clearly when people are behaving badly—that is my thesis. It is undoubtedly a police matter. The police seem to have taken ages to locate Mizzy. They eventually got him and charged him with very minor offences, so he was on TV being interviewed the other evening, laughing at how weak the law was. Under the laws he was laughing at, he could freely walk into somebody’s house or be obnoxious and get away with it. He said, “We can do what we want”. That mockery throws up problems, but I do not necessarily think that the Bill is the way to solve them.
That leads me to my concerns about Amendment 268AA, because Mizzy was quoted in the Independent newspaper as saying:
“I’m a Black male doing these things and that’s why there’s such an uproar”.
I then went on a social media thread in which any criticism of Mizzy’s behaviour was described as racist harassment. That shows the complexity of what is being called for in Amendment 268AA, which wants platforms to take additional steps
“to combat incidents of online racially aggravated harassment”.
My worry is that we end up with not only Mizzy’s TikTok videos being removed but his critics being removed for racially harassing him, so we have to be very careful here.
Amendment 268AA goes further, because it wants tech companies to push for prosecution. I really think it is a dangerous step to encourage private companies to get tangled up in deciding what is criminal and so on. The noble Lord, Lord Allan, has exactly described my concerns, so I will not repeat them. Maybe I can probe this probing amendment. It also broadens the issue to all forms of harassment.
By the way, the amendment’s explanatory statement mentions the appalling racist abuse aimed at footballers and public figures, but one of the fascinating things was that when we number-crunched and went granular, we found that the majority of that racist abuse seemed to have been generated by bots, which takes us back to the point made earlier by the noble Lord, Lord Knight: whom would you prosecute in that instance? Bots not even based in the UK were generating what was assumed to be an outbreak of racist abuse among football fans in the UK, but the numbers did not bear that out. There were some people being racist and vile, and there was some content generated in these bot farms.
To go back to the amendment, it goes on to broaden the issue out to
“other forms of harassment and threatening or abusive behaviour”.
Again, this is much more complicated in today’s climate, because those kinds of accusation can be deployed for bad faith reasons, particularly against public figures.
We have an example close to this House. I hope that Members have been following and will show solidarity over what has been happening to the noble Baroness, Lady Falkner of Margravine, who is chair of the Equality and Human Rights Commission and tasked with upholding the equality law but is at the centre of a vicious internal row after her officials filed a dossier of complaints about her. They have alleged that she is guilty of harassment. A KC is being brought in, there are 40 complaints and the whole thing is costing a fortune for both taxpayers and the noble Baroness herself.
It coincided with the noble Baroness, Lady Falkner, advising Ministers to update the definition of sex in the Equality Act 2010 to make clear that it refers to biological sex and producing official advice clarifying that trans women can be lawfully excluded from female-only spaces. We know how toxic that whole debate is.
Many of us feel that a lot of the accusations against the noble Baroness are ideologically and politically motivated vexatious complaints. I am distressed to read newspaper reports that say that she has been close to tears and has asked why anyone would go into public service. All this is for the crime of being a regulator upholding and clarifying the law. I hope the same does not happen to the person who ends up doing the regulating at Ofcom—ending up close to tears as he stands accused of harassment, abusive behaviour and so on.
The point is that she is the one being accused of harassment. I have seen the vile abuse that she has received online. It is completely defamatory, vicious abuse, and yet somehow, because she supposedly does not provide psychological safety at work and because of her views, she is the one accused of harassment and in the firing line. I do not want us to introduce that kind of complexity—this is what I have been worried about throughout—into what is banned, removed or sent to the police as examples of harassment or hate crime.
I know that is not the intention of these amendments; it is the unintended consequences that I dread.
My Lords, I will speak chiefly to Amendment 262 in my name, although in speaking after the noble Baroness, Lady Fox, who suggested that the grown-ups should control anti-social behaviour by young people online, I note that there is a great deal of anti-social behaviour online from people of all ages. This is relevant to my Amendment 262.
It is a very simple amendment and would require the Secretary of State to consult young people, by means of an advisory board consisting of people aged 25 and under, when reviewing the effectiveness and proportionality of this legislation. This amendment is a practical expression of some of the discussion we had earlier in this Committee when we were talking about including the Convention on the Rights of the Child in the Bill. There is a commonly repeated phrase, “Nothing about us without us”. It was popularised by disability activists in the 1990s, although in doing a little research for this I found that it originates in Latin in Poland in the 15th century. So it is an idea that has been around for a long while and is seen as a democratic standard. It is perhaps a variation of the old “No taxation without representation”.
This suggestion of an advisory board for the Secretary of State is because we know from the discussion earlier on the children’s rights amendments that globally one in three people online is a child under the age of 18. This brings me to the composition of your Lordships’ House. Most of us are a very long way removed in experience and age—some of us further than others. The people in this Committee thinking about a 12 year-old online now are parents, grandparents and great-grandparents. I venture to say that it is very likely that the Secretary of State is at least a generation older than many of the people who will be affected by the Bill’s provisions.
This reflects something that I also did on the Health and Care Bill. To introduce an advisory panel of young people reporting directly to the Secretary of State would ensure a direct voice in legislation that particularly affects young people. We know that under-18s across the UK do not have any role in elections to the other place, although 16 and 17 year-olds now have a role in other elections in Wales and Scotland. This is really a simple, clear, democratic step. I suspect the Minister might be inclined to say, “We are going to talk to charities and adults who represent children”. I suggest that what we really need here is a direct voice being fed in.
I want to reflect on a recent comment piece in the Guardian that made a very interesting argument: that there cannot be, now or in the future, any such thing as a digital native. Think of the experience of someone 15 or 20 years ago; yes, they already had the internet but it was a very different beast to what we have now. If we refer back to some of the earlier groups, we were starting to ask what an internet with widespread so-called generative artificial intelligence would look like. That is an internet which is very different from even the one that a 20 year-old is experiencing now.
It is absolutely crucial that we have that direct voice coming in from young people with experience of what it is like. They are the experts on what it is like to be a 12 year-old, a 15 year-old or a 20 year-old now, in a way that no one else can possibly be; so that is my amendment.
My Lords, this is the most miscellaneous of all the groups that we have had, so it has rightly been labelled as such—and the competition has been pretty strong. I want to come back to the amendments of the noble Lord, Lord Stevenson, and of the noble Lord, Lord Bassam, but first I want to deal with my Amendments 200 and 201 and to put on the record the arguments there.
Again, if I refer back to our joint report, we were strongly of the view—alongside the Communications and Digital Committee—that there should be a statutory requirement for regulators
“to cooperate and consult with one another”.
Although we welcomed the formation of the DRCF, it seemed to us that there should be a much firmer duty. I was pleased to hear the examples that my noble friend put forward of the kinds of co-operation that will be needed. The noble Baroness, Lady Morgan, clearly understands that, particularly in the area of fraud, it could be the FCA or the ICO, and it could be Ofcom in terms of social media. There is a range of aspects to this—it could be the ASA.
These bodies need to co-operate. As my noble friend pointed out, they can apparently conflict; therefore, co-operating on the way that they advise those who are subject to regulation is rather important. It is not just about the members of the Digital Regulation Cooperation Forum. Even the IWF and the ASA could be included in that, not to mention other regulators in this analogous space. That forum has rightly been labelled as “Digital”, and digital business is now all-pervasive and involves a huge number of regulatory aspects.
Although in this context Ofcom will have the most relevant powers and expertise, and many regulators will look to it for help in tackling online safety issues, effective public protection will be achieved only through proper regulatory co-operation. Ofcom should therefore be empowered, as far as possible, to work with other regulators and to share online safety information with them.
It has been very heartening to see the noble Lord, Lord Grade, in his place, even on a Thursday afternoon, and heartening how Ofcom has engaged throughout the passage of the Bill. We know the skills that it is bringing on board, and with those skills we want it to bring other regulators into its work. It seems that Ofcom is taking the lead on those algorithmic understanding skills, but we need Ofcom to have the duty to co-operate with the other regulators on this as well.
Strangely, in Clause 103 the Bill gives Ofcom the general ability to co-operate with overseas regulators, but it is largely silent on co-operation with UK regulators. Indeed, the Communications Act 2003 limits the UK regulators with which Ofcom can share information, excluding the ICO, for example, which is rather perverse in these circumstances. Since the Bill takes a permissive approach to overseas regulators, it should likewise extend co-operation and information-sharing in respect of online safety to the regulators overseeing the Schedule 7 offences that we have spent some time talking about today—the enforcement authorities responsible, for instance, for the offences relating to priority harms to children and priority offences regarding adults. Elsewhere in regulation, the Financial Conduct Authority may have a general duty to co-operate, and the reverse may be true, so that duty of co-operation will need to work both ways.
As my noble friend Lord Allan said, Amendment 200, the skilled persons provision, is very straightforward. It is just to give the formal power to be able to use the expertise from a different regulator. It is a very well-known procedure to bring skilled persons into inquiries, which is exactly what is intended there.
Both amendments tabled by the noble Lord, Lord Bassam, are rather miscellaneous too, but are not without merit, particularly Amendment 185A. Please note that I agree with the noble Baroness, Lady Fox. I 100% support the intention behind the amendment but wonder whether the Bill is the right vehicle for it. No doubt the Minister will answer regarding the scope and how practical it would be. I absolutely applaud the noble Lord for campaigning on this issue. It is extraordinarily important, because we have seen some tragic outcomes of these weapons being available for sale online.
Amendment 268AA, also tabled by the noble Lord, Lord Bassam, is entirely different. Our Joint Committee heard evidence from Edleen John of the FA and Rio Ferdinand about abuse online. It was powerful stuff. I tend to agree with my noble friend. We have talked about user empowerment, the tools for it and, particularly in the context of violence against women and girls, the need for a way to be able to report that kind of abuse or other forms of content online. This is a candidate for that kind of treatment. While platforms obviously need to prevent illegal content and have systems to prevent it and so on, having assessed risk in the way that we have heard about previously, I do not believe that expecting the platforms to pick it up and report it, turning them into a sort of proto-enforcer, is the most effective way. We have to empower users. I absolutely share the objectives set out.
My Lords, when I brought an amendment to a police Bill, my local football club told me that it was anticipating spending something like £100,000 a year on creating and developing filters, which were commercially available, to stop its footballers seeing the abuse that they were getting online. It did that for a very sensible commercial reason: those footballers’ performance was affected by the abuse they received. I want to know how the noble Lord sees this working if not by having some form of intervention that involves the platforms. Obviously, there is a commercial benefit to providers of filters et cetera, but it is quite hard for those who have been victims to see a way to make this useful to them without some external form of support.
I absolutely take what the noble Lord is saying, and I am not saying that the platforms do not have responsibility. Of course they do: the whole Bill is about the platforms taking responsibility with risk assessment, adhering to their terms of service, transparency about how those terms are operating, et cetera. It is purely on the question of whether they need to be reporting that content when it occurs. They have takedown responsibilities for illegal content or content that may be seen by children and so on, but it is about whether they have the duty to report to the police. It may seem a relatively narrow point, but it is quite important that we go with the framework. Many of us have said many times that we regret the absence of “legal but harmful” but, given where we are, we basically have to go with that architecture.
I very much enjoyed listening to the noble Baroness, Lady Bennett. No opportunity in the course of the Bill to talk about ChatGPT or GPT-4 is ever lost, and that was no exception. It means that we need to listen to how young people are responding to the way that this legislation operates. I am fully in favour of whatever mechanism it may be. It does not need to be statutory, but I very much hope that we do not treat this as the end of the process but see how the Bill works out, and listen and learn from experience—especially from young people, who are particularly vulnerable to much of the content and to the way that the algorithms on social media work.
I am so sorry. With due respect to the noble Lord, Lord Stevenson, the noble Baroness, Lady Bennett, reminded me that his Amendments 202ZA and 210A, late entrants into the miscellaneous group, go very much with the grain of what we are trying to achieve in the area of encryption. We had quite a long debate about encryption on Clause 110. As ever, the noble Lord has rather cunningly produced something that I think will get us through the eye of the free speech needle. They are two very cunning amendments.
I thank the noble Lord for that. Free expression, my Lords, not free speech.
Yes, freedom of expression. That is right.
I will start where the noble Lord, Lord Clement-Jones, finished, although I want to come back and cover other things. This is a very complicated group. I do not think we can do it quickly, as each issue is important and is worth trying to take forward.
My Lords, this has been a miscellany indeed. We must be making progress if we are picking up amendments such as these. I thank noble Lords who have spoken to the amendments and the issues covered in them.
I turn first to Amendment 185A brought to us by the noble Lord, Lord Bassam of Brighton, which seeks to add duties on online marketplaces to limit children’s access to the sale of knives, and proactively to identify and remove listings which appear to encourage the sale of knives for the purposes of violence or self-harm. Tackling knife crime is a priority for His Majesty’s Government; we are determined to crack down on this violent scourge, which is devastating our communities. I hope that he will forgive me for not drawing on the case he mentioned, as it is still sub judice. However, I certainly take the point he makes; we are all too aware of cases like it up and down the country. I received an email recently from Amanda and Stuart Stephens, whose son, Olly, was murdered by two boys, one of whom was armed with a knife. All these cases are very much in our minds as we debate the Bill.
Let me try to reassure them and the noble Lord, as well as other Members of the Committee, that the Bill, through its existing duties and other laws on the statute book, already achieves what the noble Lord seeks with his amendment. The sale of offensive weapons, and the sale of knives to people under the age of 18, are both criminal offences. Any online retailer which directly sells these prohibited items can already be held criminally liable. Once in force, the Bill will ensure that technology platforms, including online marketplaces, prevent third parties from using their platform to sell offensive weapons or knives to people under the age of 18. The Bill lists both these offences as priority offences, meaning that user-to-user services, including online marketplaces, will have a statutory obligation proactively to prevent these offences taking place on their services.
I am sorry to interrupt. The Minister has twice given a positive response, but he limited it to child sexual exploitation; he did not mention terrorism, which is in fact the bigger issue. Could he confirm that it is both?
Yes, and as I say, I am happy to talk with the noble Lord about this in greater detail. Under the Bill, category 1 companies will have a new duty to safeguard all journalistic content on their platform, which includes citizen journalism. But I will have to take all these points forward with him in our further discussions.
My noble friend Lord Bethell is not here to move his Amendment 220D, which would allow Ofcom to designate online safety regulatory duties under this legislation to other bodies. We have previously discussed a similar issue relating to the Internet Watch Foundation, so I shall not repeat the points that we have already made.
On the amendments on supposedly gendered language in relation to Ofcom advisory committees in Clauses 139 and 155, I appreciate the intention to make it clear that a person of either sex should be able to perform the role of chairman. The Bill uses the term “chairman” to be consistent with the terminology in the Office of Communications Act 2002, and we are confident that this will have no bearing on Ofcom’s decision-making on who will chair the advisory committees that it must establish, just as, I am sure, the noble Lord’s Amendment 56 does not seek to be restrictive about who might be an “ombudsman”.
I appreciate the intention of Amendment 262 from the noble Baroness, Lady Bennett of Manor Castle. It is indeed vital that the review reflects the experience of young people. Clause 159 provides for a review to be undertaken by the Secretary of State, and published and laid before Parliament, to assess the effectiveness of the regulatory framework. There is nothing in the existing legislation that would preclude seeking the views of young people either as part of an advisory group or in other ways. Moreover, the Secretary of State is required to consult Ofcom and other persons she considers appropriate. In relation to young people specifically, it may be that a number of different approaches will be effective—for example, consulting experts or representative groups on children’s experiences online. That could include people of all ages. The regulatory framework is designed to protect all users online, and it is right that we take into account the full spectrum of views from people who experience harms, whatever their age and background, through a consultation process that balances all their interests.
Amendment 268AA from the noble Lord, Lord Bassam, relates to reporting requirements for online abuse and harassment, including where this is racially motivated—an issue we have discussed in Questions and particularly in relation to sport. His amendment would place an additional requirement on all service providers, even those not in scope of the Bill. The Bill’s scope extends only to user-to-user and search services. It has been designed in this way to tackle the risk of harm to users where it is highest. Bringing additional companies in scope would dilute the efforts of the legislation in this important regard.
Clauses 16 and 26 already require companies to set up systems and processes that allow users easily to report illegal content, including illegal online abuse and harassment. This amendment would therefore duplicate this existing requirement. It also seeks to create an additional requirement for companies to report illegal online abuse and harassment to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report their investigations into crimes that occur online, other than child exploitation and abuse. This is because the Bill aims to prevent and reduce the proliferation of illegal material and the resulting harm it causes to so many. Additionally, Ofcom will be able to require companies to report on the incidence of illegal content on their platforms in its transparency reports, as well as the steps they are taking to tackle that content.
I hope that reassures the noble Lord that the Bill intends to address the problems he has outlined and those explored in the exchange with the noble Lord, Lord Clement-Jones. With that, I hope that noble Lords will support the government amendments in this group and be satisfied not to press theirs at this point.
My Lords, I listened very carefully to the Minister’s response to both my amendments. He has gone some way to satisfying my concerns. I listened carefully to the concerns of the noble Baroness, Lady Fox, and noble Lords on the Lib Dem Benches. I am obviously content to withdraw my amendment.
I do not quite agree with the Minister’s point about dilution on the last amendment—I see it as strengthening—but I accept that the amendments themselves slightly stretch the purport of this element of the legislation. I shall review the Minister’s comments and I suspect that I shall be satisfied with what he said.
My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased because, for the first time, we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill.
My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.
We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The thresholds above which services will fall within category 1 are still to be set; we await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums, whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.
We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.
A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content is concentrated on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.
The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.
The Bill’s pre-legislative scrutiny committee recommended that the legislation
“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.
The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.
This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.
This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.
My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.
The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review, which we have heard mentioned, confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm, and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.
My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.
We will be creating rules for categorising platforms. As I understand it, the rules will place a heavy emphasis on user numbers but will not be exclusively linked to them. It would be helpful if the Minister could tease out a little more about how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules and to have a mechanism in place for that.
We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when a service has high growth but still very little revenue and infrastructure in place to look after its users. This is a problem generally with stepped models, where you have these great jumps. In a sense, a sliding scale would be more rational, so that responsibilities increase over time, but that is clearly hard to do in practice, so we are going to end up with some kind of step model.
We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.
The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention to comply, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin. They will have no interest whatever. Some of the worst services will be like that. The advantage of us ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told they are categorised as 2B but given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely to work with them: services that are borderline but recognise that they want to have a viable business, and they do not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.
Finally, there is another group which is very mainstream but in the growing phase: busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to be able to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to deliver, which is a safer internet for people in the UK.
I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small, you have no potential harms, any more than if you are large, you are harmful. The exception should not become the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying but stress again: do not assume that everybody agrees on what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.
I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites, but if you read any of those manifestos, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. The fact that they appear on 4chan, or wherever, is not necessarily the problem; it is much more complicated.
I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—
I just want to react to the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying that is problematic in my view—and I hope it would be in the noble Baroness’s view as well.
I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.
I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is my only point.
My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.
As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.
We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.
We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say “They would say that, wouldn’t they?” but they were pretty convincing. We heard from HOPE not Hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.
Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and the Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think hard about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is to the fore—the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in the Bill in dealing with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.
My Lords, this has been a good little debate with some excellent speeches, which I acknowledge. Like the noble Lord, Lord Clement-Jones, I was looking at the Joint Committee’s report. I concluded that one of the first big issues we discussed was how complicated the categorisation seemed in relation to the task being set for Ofcom. We comforted ourselves with the thought that, if you believe that this is basically a risk-assessment exercise and that all the work Ofcom will subsequently do is driven by its risk assessments and its constant reviewing of them, then the categorisation question is bound to fall away, because the risks will reveal the things that need to happen.
I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially to do so in a way which has given us a good debate on which to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.
The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.
As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.
I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following the removal of the “legal but harmful” provisions in another place, the Bill no longer includes the concept of risk of harm in category 1 designation. As we have set out, it would not be right for the Government to define what legal content it considers harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and require them to carry out duties based on this definition.
In addition, requiring all companies to comply with the full range of category 1 duties would pose a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal content and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.
As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.
A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.
The Bill also limits the availability of such content by placing illegal content duties on search services, including in relation to harmful content which affects children or content which is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.
Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.
Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.
The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than is proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.
I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.
Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.
I am sure that the noble Baroness, Lady Morgan, will want to come in—but could he repeat that? Risk assessment drives us, yet a company that is not regarded as a category 1 provider, because it does not meet the categorisation thresholds, will not be subject to the relevant requirements even though it may be higher risk than some of the category 1 companies. The particular issues raised by the noble Baroness and the noble Lord—which are clearly social harms—will therefore not really be considered on a par.
In the response I gave, I said that we are making the risk assessment that the riskiest behaviour is illegal content and content which presents a harm to children. That is the assessment and the approach taken in the Bill. In relation to other content which is legal and for adults to choose how they encounter it, there are protections in the Bill to enforce terms of service and empower users to curate their own experience online, but that assessment is made by adult users within the law.
I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to and will no doubt be again, and we note the impact of the absence of legal but harmful as a concept. As the noble Baroness, Lady Bull, said, I know that the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.
I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of a platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms and the health and well-being of those who are using them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.
The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation—although we have not quite reached the end, and I do not want to prejudge any further stages—in the sense that we are now thinking about how this would work. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.
Ultimately, the Government of the day—whoever it might be—will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future on a platform that is broadcasting, distributing or amplifying content that is deeply harmful. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, and of more detail on categorisation and on any review of the platforms categorised as category 1, 2 and beyond, would be very helpful in due course. I beg leave to withdraw.