Baroness Kidron (CB)

I support these amendments, which were set out wonderfully by the noble Lord, Lord Stevenson. I want to raise a point made on Tuesday when the noble Baroness, Lady Merron, said that only 3% of people read terms of service and I said that 98% of people do not read them, so one of us is wrong, but I think the direction of travel is clear. She also used a very interesting phrase about prominence, and I want to use this opportunity to ask the Minister whether there is some lever whereby Ofcom can insist on prominence for certain sorts of material—a hierarchy of information, if you like—because these are really important pieces of information, buried in the wrong place so that even 2% or 3% of people may not find them.

Lord Allan of Hallam (LD)

My Lords, I am very pleased that the noble Lord, Lord Stevenson, has given us the opportunity to talk about terms of service, and I will make three points again, in a shorter intervention than on the previous group.

First, terms of service are critical because their impact on the amount of intervention that occurs on content will generally be much greater than that of the law. Terms of service create, in effect, a body of private law for a community, and they are nearly always a superset of the public law—indeed, it is very common for the first item in a set of terms of service to say, “You must not do anything illegal”. This raises the interesting question of “illegal where?”—what it generally means is that you must not do anything illegal in the jurisdiction in which the service provider is established. The terms of service will say, “Do not do anything illegal”, and then they will give a whole list of other things, beyond illegality, that you cannot do on the platform, and I think this is right because they have different characteristics.

--- Later in debate ---
Lord Moylan (Con)

My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.

If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.

Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.

This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.

Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to

“prevent individuals from encountering priority illegal content by means of the service”,

it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.

Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a

“duty to operate a service using proportionate systems and processes … to … minimise the length of time”

for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.

My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:

“An offence under any of the following provisions of the Public Order Act 1986”.


One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.

If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.

Lord Allan of Hallam (LD)

Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.

Lord Moylan (Con)

I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.

Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.

It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:

“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.


I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.

I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that

“exist in relation to content and an offence if, following the approach in subsection (2)”

and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.

--- Later in debate ---
The Lord Bishop of Guildford

My Lords, I will speak to Amendments 128, 130 and 132, as well as Amendments 143 to 153 in this grouping. They were tabled in the name of my right reverend colleague the Bishop of Derby, who is sorry that she cannot be here today.

The Church of England is the biggest provider of youth provision in our communities and educates around 1 million of our nation’s children. My colleague’s commitment to the principles behind these amendments also springs from her experience as vice chair of the Children’s Society. The amendments in this grouping are intended to strengthen legislation on online grooming for the purpose of child criminal exploitation, addressing existing gaps and ensuring that children are properly protected. They are also intended to make it easier for evidence of children being groomed online for criminal exploitation to be reported by online platforms to the police and the National Crime Agency.

Research from 2017 shows that one in four young people reported seeing illicit drugs advertised for sale on social media—a percentage that is likely to be considerably higher six years on. According to the Youth Endowment Fund in 2022, 20% of young people reported having seen online content promoting gang membership in the preceding 12 months, with 24% reporting content involving the carrying, use or promotion of weapons.

In relation to drugs, that later research noted that these platforms provide opportunities for dealers to build trust with potential customers, with young people reporting that they are more likely to see a groomer advertising drugs as a friend than as a dealer. This reduces the scruples or trepidation they might feel about buying drugs in the first place, leaving them vulnerable to exploitation. Meanwhile, it is also clear that social media is changing the operation of the county lines model. There is no longer the need to transport children from cities into the countryside to sell drugs, given that children who live in less populated areas can be groomed online as easily as in person. A range of digital platforms is therefore being used to target potential recruits among children and young people, with digital technologies also being deployed—for example, to monitor their whereabouts on a drugs run.

More research is being carried out by the Children’s Society, whose practitioners reported a notable increase in the number of perpetrators grooming children through social media and gaming sites during the first and second waves of the pandemic. Young people were being contacted with promotional material about lifestyles they could lead and the advantages of working within a gang, and were then asked to do jobs in exchange for money or status within this new group. It is true that some such offences could be prosecuted under the Modern Slavery Act 2015, but there remains a huge disparity between the scale of exploitation and the number of those being charged under the Act. Without a definition of child exploitation for criminal purposes, large numbers of children are being groomed online and paying the price for crimes committed by some of their most dangerous and unscrupulous elders.

It is vital that we protect our children from online content which facilitates that criminal exploitation, in the same way that we are looking to protect them from sexual exploitation. Platforms must be required to monitor for illegal content related to child criminal exploitation on their sites and to have mechanisms in place for users to flag it with those platforms so it can be removed. This can be achieved by bringing modern slavery and trafficking, of which child criminal exploitation is a form, within the scope of illegal content in the Bill, which is what these amendments seek to do. It is also vital that the law sets out clear expectations on platforms to report evidence of child criminal exploitation to the National Crime Agency, in the same way as they are expected to report content involving child sexual exploitation and abuse, so that child victims can be identified and receive support. Such evidence may enable action against the perpetrators without the need for a disclosure from child victims. I therefore fully support and endorse the amendments standing in the name of the right reverend Prelate.

Lord Allan of Hallam (LD)

My Lords, this is again a very helpful set of amendments. I want to share some experience that shows that legality tests are really hard. Often from the outside there is an assumption that it is easy to understand what is legal and illegal in terms of speech, but in practice that is very rarely the case. There is almost never a bright line, except in a small class of child sexual abuse material where it is always illegal and, as soon as you see the material, you know it is illegal and you can act on it. In pretty much every other case, you have to look at what is in front of you.

I will take a very specific example. Something we had to deal with was images of Abdullah Öcalan, the leader of the PKK in Turkey. If somebody shared a picture of Abdullah Öcalan, were they committing a very serious offence, which is the promotion of terrorism? Were they indicating support for the peace process that was taking place in Turkey? Were they showing that they support his socialist and feminist ideals? Were they supporting the YPG, a group in Syria to which we were sending arms, that venerates him? This is one example of many I could give where the content in front of you does not tell you very clearly whether or not the speech is illegal or speech that should be permitted. Indeed, we would take speech like that down and I would get complaints, including from Members of Parliament, saying, “Why have you removed that speech? I’m entitled to talk about Abdullah Öcalan”, and we would enter into an argument with them.

We would often ask lawyers in different countries whether they could tell us whether a speech was legal or illegal. The answer would come back as probably illegal, likely illegal, maybe illegal and, occasionally, definitely not illegal, but it was nearly always on the spectrum. The amendments we are proposing today are to try to understand where the Government intend people to draw that line when they get that advice. Let us assume the company wants to do the right thing and follow the instructions of the Bill and remove illegal content. At what level do they say it has met the test sufficiently, given that in the vast majority of cases, apart from the small class of illegal content, they are going to be given only a likelihood or a probability? As the noble Lord, Lord Moylan, pointed out, we are trying to insert this notion of sufficient evidence through Amendments 273, 275, 277, 280 and 281 in the names of my noble friend Lord Clement-Jones and the noble Viscount, Lord Colville, who is unable to be in his place today. I think the noble Baroness, Lady Kidron, may also have signed them. We are trying to flesh out the point at which that illegality standard should kick in.

Just to understand again how this often works when the law gets involved, I say that there is a law in Germany; the short version is NetzDG. If there are any German speakers who can pronounce the compound noun that is its full title, there will be a prize. It is a long compound word that means “network enforcement Act”. It has been in place for a few years and it tells companies to do something similar—to remove content that is illegal in Germany. There would be cases where we would get a report from somebody saying, “This is illegal”, and we would take action; then it would go into the German system and, three months later, we would finally be told whether it was actually illegal, in a 12-page judgment that a German court had worked out. In the meantime, all we could do was work on our best guess while that process was going on. I think we need to be very clear that illegality is hard.

Cross-jurisdictional issues present us with another set of challenges. If both the speaker and the audience are in the United Kingdom, it is fairly clear. But in many cases, when we are talking about online platforms, one or other, or even both the speaker and the audience, may be outside the United Kingdom. Again, when does the speech become illegal? It may be entirely legal speech between two people in the United States. I think—and I would appreciate clarification from the Minister—that the working assumption is that if the speech was reported by someone not in the United States but in the UK, the platform would be required to restrict access to it from the UK, even though the speech is entirely legal in the jurisdiction in which it took place. Because the person in the UK encountered it, there would be a duty to restrict it. Again, it has been clarified that there is certainly not a duty to take the speech down, because it is entirely legal speech outside the UK. These cross-jurisdictional issues are interesting; I hope the Minister can clarify that.

The amendments also try to think about how this would work in practice. Amendment 287 talks about how guidance should be drawn up in consultation with UK lawyers. That is to avoid a situation where platforms are guessing too much at what UK lawyers want; they should at least have sought UK legal advice. That advice will then be fed into the guidance given to their human reviewers and their algorithms. That is the way, in practice, in which people will carry out the review. There is a really interesting practical question—which, again, comes up under NetzDG—about the extent to which platforms should be investing in legal review of content that is clearly against their terms of service.

There will be two kinds of platform. There will be some platforms that see themselves as champions of freedom of expression and say they will only remove stuff that is illegal in the UK, and everything else can stay up. I think that is a minority of platforms—they tend to be on the fringes. As soon as a platform gets a mainstream audience, it has to go further. Most platforms will have terms of service that go way beyond UK law. In that case, they will be removing the hate speech, and they will be confident that they will remove UK-illegal hate speech within that. They will remove the terrorist content. They will be confident and will not need to do a second test of the legality in order to be able to remove that content. There is a practical question about the extent to which platforms should be required to do a second test if something is already illegal under their terms.

There will be, broadly speaking again, four buckets of content. There will be content that is clearly against a platform’s terms, which it will want to get rid of immediately. It will not want to test it again for legality; it will just get rid of it.

There will be a second bucket of content that is not apparently against a platform’s terms but clearly illegal in the UK. That is a very small subset of content: in Germany, that is Holocaust denial content; in the United Kingdom, this Parliament has looked at Holocaust denial and chosen not to criminalise it, so that will not be there, but an equivalent for us would be migration advice. Migration advice will not be against the terms of service of most platforms, but the Government’s intention in the Illegal Migration Bill is to make it illegal, and the consequent effect will be that it will have to be removed under the terms of this Bill. So there will be that small set of content that is illegal in the UK but not against terms of service.

There will be a third bucket of content that is not apparently against the terms or the law, and that actually accounts for most of the complaints that a platform gets. I will choose my language delicately: complaint systems are easy to use, and people complain to make a point. They use complaint systems as a kind of dislike button. The reality is that one of the most common sets of complaints you get is when there is a football match and the two opposing teams report the content on each other’s pages as illegal. They will do that every time, and you get used to it, and that is why you learn to discount mass-volume complaints. But again, we should be clear that there are a great many complaints that are merely vexatious.

The final bucket is of content that is unclear, where legal review will be needed. Our amendment is intended to deal with those cases. A platform will go out and get advice. It is trying to understand at what point something like migration advice tips over into illegality, as opposed to being advice about going on holiday, and it is trying to understand that based on what it can immediately see. Once it has sought that advice, it will feed it back into the guidance to reviewers and the algorithms, to try to remove content more effectively, be compliant with the Bill as a whole and not get into trouble with Ofcom.

Some areas are harder than others. The noble Lord, Lord Moylan, already highlighted one: public order offences, which are extremely hard. If somebody says something offensive or holds an offensive political view—I suspect the noble Baroness, Lady Fox, may have something to say on this—people may well make contact and claim that it is in breach of public order law. On the face of it, they may have a reasonably arguable case but again, as a platform, you are left to make a decision.

--- Later in debate ---
I noted earlier that the noble Lord, Lord Bethell, made a passionate intervention about, of all things, Andrew Tate and his illegality in relation to this Bill. That prompted me to think a number of things. Andrew Tate is an influencer who I despise, as I do the kind of things he says. But, as far as I know, the criminal allegations he faces are not yet resolved, so he has to be seen as innocent until proven guilty. Most of what he has online that is egregious might well be in bad taste, as people say—I would say that it is usually misogynist—but it is not against the law. If we get to a situation where that is described as illegality, that is the kind of thing that I worry about. As we have heard from other noble Lords, removing so-called illegal content for the purpose of complying with this regulatory system will mean facing such dilemmas.
Lord Allan of Hallam (LD)

In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. We know that it is illegal in that case because a judge writes to the company and says, “You must not allow this to be said because it is in contempt of court”; in live proceedings, it is in most cases absolutely clear because a judge has told you. But that really is the exception: in most other cases, someone is saying, “I think it is illegal”.

Baroness Fox of Buckley (Non-Afl)

That is very helpful.

I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.

The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By “contentious”, I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system and an understanding of the whole history of established case law. That is all necessary to secure a conviction under that law for offences of this nature.

Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase “reasonable grounds to infer”. If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to overremove and censor what ultimately will be entirely lawful speech.

Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.

I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as “assisting unlawful immigration”, illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain means of entry to the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down and people advocating for a position against the Government’s new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?

The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.

--- Later in debate ---
Lord Bethell (Con)

I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.

So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.

If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?

Lord Allan of Hallam (LD)

My Lords, I would like to raise one issue that I forgot to mention earlier; I think it would be more efficient to pose the question to the Minister now rather than interject when he is speaking.

On the Government’s Amendments 136A, 136B and 136C on the immigration offences, the point I want to make is that online services can be literal life-savers for people who are engaged in very dangerous journeys, including journeys across the Channel. I hope the Minister will be clear that the intention here is to require platforms to deal only with content, for example, from criminals who are offering trafficking services, and that there is no intention to require platforms somehow to withdraw services from the victims of those traffickers when they are using those services in the interest of saving their own lives or seeking advice that is essential to preserving their own safety.

That would create—as I know he can imagine—real ethical and moral dilemmas, and we should not be giving any signal that we intend to require platforms to withdraw services from people who are in desperate need of help, whatever the circumstances.

Lord Stevenson of Balmacara (Lab)

My Lords, we seem to have done it again—a very long list of amendments in a rather ill-conceived group has generated a very interesting discussion. We are getting quite good at this, exchanging views across the table, across the Committee, even within the Benches—Members who perhaps have not often talked together are sharing ideas and thoughts, and that is a wonderful feeling.

I want to start with an apology. I think I may be the person who got the noble Baroness, Lady Kidron, shopped by the former leader—once a leader, always a leader. What I thought I was being asked was whether the Committee would be interested in hearing the views of the noble Viscount who could not be present, and I was very keen, because when he does speak it is from a point of view that we do not often hear. I did not know that it was a transgression of the rules—but of course it is not, really, because we got round it. Nevertheless, I apologise for anything that might have upset the noble Baroness’s blood pressure—it did not stop her making a very good contribution later.

We have covered so much ground that I do not want to try and summarise it in one piece, because you cannot do that. The problem with the group as it stands is that the right reverend Prelate the Bishop of Derby and myself must have some secret connection, because we managed to put down almost the same amendments. They were on issues that then got overtaken by the Minister, who finally got round to—I mean, who put down a nice series of amendments which exactly covered the points we made, so we can lose all those. But this did not stop the right reverend Prelate the Bishop of Guildford making some very good additional points which I think we all benefited from.

I welcome back the noble Baroness, Lady Buscombe, after her illness; she gave us a glimpse of what is to come from her and her colleagues, but I will leave the particular issue that she raised for the Minister to respond to. It raises an issue that I am not competent on, but it is a very important one—we need to get the right balance between what is causing the alarm and difficulty outside in relation to what is happening on the internet, and I think we all agree with her that we should not put any barrier in the way of dealing with that.

Indeed, that was the theme of a number of the points that have been raised on the question of what is or can constitute illegal content, and how we judge it. It is useful to hear again from the master about how you do it in practice. I cannot imagine being in a room of French lawyers and experts and retaining my sanity, let alone making decisions that affect the ability of people to carry on, but the noble Lord did it; he is still here and lives to tell the tale—bearded or otherwise.

The later amendments, particularly from the noble Lord, Lord Clement-Jones, are taking us round in a circle towards the process by which Ofcom will exercise the powers that it is going to get in this area. These are probably worth another debate on their own, and maybe it will come up in a different form, because—I think the noble Baroness, Lady Stowell, made this point as well—there is a problem in having an independent regulator that is also the go-to function for getting advice on how others have to make decisions that are theirs to rule on at the end if they go wrong. That is a complicated way of saying that we may be overloading Ofcom if we also expect it to provide a reservoir of advice on how you deal with the issues that the Bill puts firmly on the companies—I agree that this is a problem that we do not really have an answer to.

My amendments were largely overtaken by the Government’s amendments, but the main one I want to talk about was Amendment 272. I am sorry that the noble Baroness, Lady Morgan, is not here, because her expertise is in an area that I want to talk about, which is fraud—cyber fraud in particular—and how that is going to be brought into the Bill. The issue, which I think has been raised by Which? but which a number of other people have also written to us about, is that the Bill, in Clauses 170 and 171, is trying to establish how a platform should identify illegal content in relation to fraud—but it is quite prescriptive. In particular, it goes into some detail which I will leave for the Minister to respond to, but uniquely it sets out a specific way of gathering information to determine whether content is illegal in this area, although it may have applicability in other areas.

One of the points that have to be taken into account is whether the platform is using human moderators, automated systems or a combination of the two. I am not quite sure why that is there in the Bill; that is really the basis for the tabling of our amendments. Clearly, one would hope that what matters in the end is whether or not illegality has taken place, not how that information has been gathered. If one must make concessions to the process of law because a judgment is made that, because it is automated, it is in some way not as valid as if it had been done by a human moderator, there seems to be a whole world there that we should not be going into. I certainly hope that that is not going to be the case if we are talking about illegality concerning children or other vulnerable people, but that is how the Bill reads at present; I wonder whether the Minister can comment on that.

There is a risk of consumers being harmed here. The figures on fraud in the United Kingdom are extraordinary; the fact that it is not the top priority for everybody, let alone the Government, is remarkable. Consumers are being scammed at a rate of something like £7.5 billion a year. A number of awful types of scamming have emerged only because of the internet and social media. They create huge problems of anxiety and emotional distress, with lots of medical care and other things tied in if you want to work out the total bill. So we have a real problem here that we need to settle. It is great that it is in the Bill, but it would be a pity if the movement towards trying to resolve it were in any way impeded by there being imperfect instructions in the Bill. I wonder whether the Minister would be prepared to respond to that; I would be happy to discuss it with him later, if that is possible.

As a whole, this is an interesting question as we move away from what a crime is towards how people judge how to deal with what they think is a crime but may not be. The noble Lord, Lord Allan, commented on how to do it in practice but one hopes that any initial problems will be overcome as we move forward and people become more experienced with this.

When the Joint Committee considered this issue, we spent a long time talking about why we were concerned about having certainty on the legal prescription in the Bill; that is why we were very much against the idea of “legal but harmful”, because it seemed too subjective and too subject to difficulties. Out of that came another thought, which answers the point made by the noble Baroness, Lady Stowell: so much of this is about fine judgments: certain things are set in stone and you can work to them, but you then have to interpret them.

There is a role for Parliament here, I think; we will come on to this in later amendments but, if there is a debate to be had on this, let us not forget the points that have been made here today. If we are going to think again about Ofcom’s activity in practice, that is the sort of thing that either a Joint Committee or Select Committees of the two Houses could easily take on board as an issue that needs to be reflected on, with advice given to Parliament about how it might be taken forward. This might be the answer in the medium term.

In the short term, let us work to the Bill and make sure that it works. Let us learn from the experience but let us then take time out to reflect on it; that would be my recommendation but, obviously, that will be subject to the situation after we finish the Bill. I look forward to hearing the Minister’s response.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes, that would be welcome.

Lord Allan of Hallam (LD)

Can I suggest one of mine?

Lord Parkinson of Whitley Bay (Con)

I thank the noble Lord.

I was pleased to hear about Wicipedia Cymraeg—there being no “k” in Welsh. As the noble Lord, Lord Stevenson, said, there has been a very good conversational discussion in this debate, as befits Committee and a self-regulating House. My noble friend Lady Stowell is right to point out matters of procedure, although we were grateful to know why the noble Viscount, Lord Colville, supports the amendments in question.

--- Later in debate ---
Companies will need to ensure that they have effective systems to enable them to check the broader context relating to content when deciding whether or not to remove it. This will provide greater certainty about the standard to be applied by providers when assessing content, including judgments about whether or not content is illegal. We think that protects against over-removal by making it clear that platforms are not required to remove content merely on the suspicion of it being illegal. Beyond that, the framework also contains provisions about how companies’ systems and processes should approach questions of mental states and defences when considering whether or not content is an offence in the scope of the Bill.
Lord Allan of Hallam (LD)

I am struggling a little to understand why the Minister thinks that sufficient evidence is subjective, and therefore, I assume, reasonable grounds to infer is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.

Lord Moylan (Con)

Or indeed any evidence.

--- Later in debate ---
Lord Allan of Hallam (LD)

Again, I think that that is clear. I understood from the Bill that, if an American says something that would be illegal were they to be in the United Kingdom, we would still want to exclude that content. But that still leaves it open, and I just ask the question again, for confirmation. If all of the activities are outside the United Kingdom—Americans talking to each other, as it were—and a British person objects, at what point would the platform be required to restrict the content of the Americans talking to each other? Is it pre-emptively or only as and when somebody in the United Kingdom objects to it? We should flesh out that kind of practical detail before this becomes law.

Lord Parkinson of Whitley Bay (Con)

If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.

Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.

My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering such illegal content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

My Lords, I support this group of amendments, so ably introduced by my noble friend and other noble Lords this afternoon.

I am not a lawyer and I would not say that I am particularly experienced in this business of legislating. I found this issue incredibly confusing. I hugely appreciate the briefings and discussions—I feel very privileged to have been included in them—with my noble friend the Minister, officials and the Secretary of State herself in their attempt to explain to a group of us why these amendments are not necessary. I was so determined to try to understand this properly that, yesterday, when I was due to travel to Surrey, I took all my papers with me. I got on the train at Waterloo and started to work my way through the main challenges that officials had presented.

The first challenge was that, fundamentally, these amendments cut across the Bill’s definitions of “primary priority content” and “priority content”. I tried to find them in the Bill. Unfortunately, in Clause 54, there is a definition of primary priority content. It says that, basically, primary priority content is what the Secretary of State says it is, and that content that is harmful to children is primary priority content. So I was none the wiser on Clause 54.

One of the further challenges that officials have given us is that apparently we, as a group of noble Lords, were confusing the difference between harm and risk. I then turned to Clause 205, which comes out with the priceless statement that a risk of harm should be read as a reference to harm—so maybe they are the same thing. I am still none the wiser.

Yesterday morning, I found myself playing what I can only describe as a parliamentary game of Mornington Crescent, as I went round and round in circles. Unfortunately, it was such a confusing game of Mornington Crescent that I forgot that I needed to change trains, ended up in Richmond instead of Redhill, and missed my meeting entirely. I am telling the Committee this story because, as the debate has shown, it is so important that we put in the Bill a definition of the harms that we are intending to legislate for.

I want to address the points made by the noble Baroness, Lady Fox. She said that we might not all agree on what harms are genuinely harmful for children. That is precisely why Parliament needs to decide this, rather than abdicate it to a regulator who, as other noble Lords said earlier today, is then put into a political space. It is the job of Parliament to decide what is dangerous for our children and what is not. That is the approach that we take in the physical world, and it should be the approach that we take in the online world. We should do that in broad categories, which is why the four Cs is such a powerful framework. I know that we are all attempting to predict the known unknowns, which is impossible, but this framework, which gives categories of harm, is clear that it can be updated, developed and, as my noble friend Lord Bethell, said, properly consulted on. We as parliamentarians should decide; that is the purpose of voting in Parliament.

I have a couple of questions for my noble friend the Minister. Does he agree that Parliament needs to decide what the categories of online harms are that the Bill is attempting to protect our children from? If he does, why is it not the four Cs? If he really thinks it is not the four Cs, will he bring back an alternative schedule of harms?

Lord Allan of Hallam (LD)

My Lords, I will echo the sentiments of the noble Baroness, Lady Harding, in my contribution to another very useful debate, which has brought to mind the good debate that we had on the first day in Committee, in response to the amendment tabled by the noble Lord, Lord Stevenson, in which we were seeking to get into the Bill what we are actually trying to do.

I thought that the noble Baroness, Lady Fox, was also welcoming additional clarity, specifically in the area of psychological harm, which I agree with. Certainly in its earlier incarnations, the Bill was scattered throughout with references to psychological harm; some of those have been removed, but those that remain are very much open to interpretation. I hope that we will come back to that.

I was struck by the point made by the noble Lord, Lord Russell, around what took place in that coroner’s hearing. You had two different platforms with different interpretations of what they thought that their duty of care would be. That is very much the point. In my experience, platforms will follow what they are told to follow. The challenge is when each of them comes to their own individual view around what are often complex areas. There we saw platforms presenting different views about their risk assessments. If we clarify that for them through amendments such as these, we are doing everyone a favour.

I again compliment my noble friend Lady Benjamin for her work in this area. Her speech was also a model of clarity. If we can bring some of that clarity to the legislation and to explaining what we want, that will be an enormous service.

The noble Lord, Lord Knight, made some interesting points around how this would add value to the Bill, teasing out some of the specific gaps that we have there. I look forward to hearing the response on that.

I was interested in the comments from the noble Lord, Lord Bethell, on mobile phone penetration. We should all hold in common that we are not going back to a time BC—before connection. Our children will be connected, which creates the imperative for us to get this right. There has perhaps been a tendency for us to bury our heads in the sand, and occasionally you hear that still—it is almost as if we would wish this world away. However, the noble Baroness, Lady Kidron, is at the other end of the spectrum; she has come alive on this subject, precisely because she recognises that that will not happen. We are in a world where our children will be connected, so it is on us to figure out how we want those connections to work and to instruct the people who provide those connective services on what they should do. It is certainly not for us to imagine that somehow they will all go away. We will come to that in later groups when we talk about minimum ages; if younger children are online, there is a real issue around how we are going to deal with that.

The right reverend Prelate the Bishop of Oxford highlighted some really important challenges based on real experiences that families today are suffering—let us use the word as it should be—and made the case for clarity. I do not know how much we are allowed to talk in praise of EU legislation, but I am looking at the Digital Services Act—I have looked at a lot of EU legislation—and this Bill, and there is a certain clarity to EU regulation, particularly the process of adding recitals, which are attached to the law and explain what it is meant to do. That is sometimes missing here. I know that there are different legal traditions, but you can sometimes look at an EU regulation and the UK law and the former appears to be much clearer in its intent.

That brings me to the substance of my comments in response to this group, so ably introduced by the noble Baroness, Lady Kidron. I hope that the Government heed and recognise that, at present, no ordinary person can know what is happening in the Bill—other than, perhaps, the wife of the noble Lord, Lord Stevenson, who will read it for fun—and what we intend to do.

I was thinking back to the “2B or not 2B” debate we had earlier about the lack of clarity around something even as simple as the classification of services. I was also thinking that, if you ask what the Online Safety Bill does to restrict self-harm content, the answer would be this: if it is a small social media platform, it will probably be categorised as a 2B service; then we can look at Schedule 7, where it is prohibited from assisting suicide; but we might want to come back to some of the earlier clauses with the specific duties—and it will go on and on. As the noble Baroness, Lady Harding, described, you are leaping backwards and forwards in the Bill to try to understand what we are trying to do with the legislation. I think that is a genuine problem.

In effect, the Bill is Parliament setting out the terms of service for how we want Ofcom to regulate online services. We debated terms of service earlier. What is sauce for the goose is sauce for the gander. We are currently failing our own tests of simplicity and clarity on the terms of service that we will give to Ofcom.

It is not just platforms: if ordinary people want to find out what is happening then, just as with platforms’ terms of service, we are going to make them read hundreds of pages before they find out what this legislation is intended to do. We can and should make this simpler for children and parents. I was able to meet Ian Russell briefly at the end of our Second Reading debate. He has been an incredibly powerful and pragmatic voice on this. He is asking for reasonable things. I would love to be able to give a Bill to Ian Russell, and the other families that the right reverend Prelate the Bishop of Oxford referred to, that they can read and that tells them very clearly how Parliament has responded to their concerns. I think we are a long way short of that simple clarity today.

It would be extraordinarily important for service providers, as I already mentioned in response to the noble Lord, Lord Russell. They need that clarity, and we want to make sure that they have no reason to say, “I did not understand what I was being asked to do”. That should be from the biggest to the smallest, as the noble Lord, Lord Moylan, keeps rightly raising with us. Any small service provider should be able to very clearly and simply understand what we are intending to do, and putting more text into the Bill that does that would actually improve it. This is not about adding a whole load of new complications and the bells and whistles we have described but about providing clarity on our intention. Small service providers would benefit from that clarity.

The noble Baroness, Lady Ritchie, rightly raised the issue of the speed of the development of technology. Again, we do not want the small service provider in particular to think it has to go back and do a whole new legal review every time the technology changes. If we have a clear set of principles, it is much quicker and simpler for it to say, “I have developed a new feature. How does it match up against this list?”, rather than having to go to Clause 12, Clause 86, Clause 94 and backwards and forwards within the Bill.

It will be extraordinarily helpful for enforcement bodies such as Ofcom to have a yardstick—again, this takes us back to our debate on the first day—for its prioritisation, because it will have to prioritise. It will not be able to do everything, everywhere, all at once. If we put that prioritisation into the legislation, it will, frankly, save potential arguments between Parliament, the Government and Ofcom later on, when they have decided to prioritise X and we wanted them to prioritise Y. Let us all get aligned on what we are asking them to do up front.

Dare I say—the noble Baroness, Lady Harding, reminded me of this—that it may also be extraordinarily helpful for us as politicians, so that we can understand the state of the law. I mean not just the people who are existing specialists or are becoming specialists in this area and taking part in this debate but the other hundreds of Members of both Houses, because this is interesting to everyone. I have experience of being in the other place, and every Member of the other place will have constituents coming to them, often in very tragic circumstances, and asking what Parliament has done. Again, with the Online Safety Bill as currently drafted, I think it is hard for any Member of Parliament to be able to say clearly, “This is what we have done”. With those words and that encouraging wind, I hope the Government are able to show, if not in this way then in another, that they are committed to ensuring that we have that clarity for everybody involved in this process.