Lords Chamber

My Lords, in view of the hour, I will be brief, and I have no interests to declare other than that I have grandchildren. I rise to speak to a number of amendments tabled in my name in this group: Amendments 216A to 216C, 218ZZA to 218ZD and 218BA to 218BC. I do not think I have ever achieved such a comprehensive view of the alphabet in a number of amendments.
These amendments carry a simple message: Ofcom must act decisively and quickly. I have tabled them out of a deep concern that the Bill does not specify timescales or obligations within which Ofcom is required to act. It leaves Ofcom, as the regulator, with huge flexibility and discretion as to when it must take action; indeed, some enforcement processes could drag on for years.
Phrases such as
“OFCOM may vary a confirmation decision”
or it
“may apply to the court for an order”
are not strong enough, in my view. If unsuitable or harmful material is populating social media sites, the regulator must take action. There is no sense of urgency within the drafting of the Bill. If contravention is taking place, action needs to be taken very quickly. If Ofcom delays taking an action, the harmful influence will continue. If the providers of services know that the regulator will clamp down quickly and severely on those who contravene, they are more likely to comply in the first place.
I was very taken by the earlier comments of the noble Baroness, Lady Harding, about putting additional burdens on Ofcom. These amendments are not designed to put additional burdens on Ofcom; indeed, the noble Lord, Lord Knight, referred to the fact that, for six years, I chaired the Better Regulation Executive. It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.
Noble Lords will be pleased to hear that I do not intend to go through each individual amendment. They all have a single purpose: to require the regulator—in this case, Ofcom—to act when necessary, as quickly as possible within specified timescales; and to toughen up the Bill to reduce the risk of continuous harmful content being promoted on social media.
I hope that the Minister will take these comments in the spirit in which they are intended. They are designed to help Ofcom and to help reduce the continuous adverse influence that many of these companies will propagate if they do not think they will be regulated severely.
My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.
I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.
I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.
I also want to know how noble Lords think this will lie in relation to the UK being a science and technology superpower. Understandably, some people have argued that these amendments are making the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia Foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?
What is the criminal offence that has a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech-savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?
My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.
I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.
One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than some countries. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.
Lords Chamber

I beg the pardon of the Committee. I asked about it and was misinformed; I will do as the noble Baroness says.
The noble Viscount, Lord Colville, is unable to be with us. He put his name to Amendments 273, 275, 277 and 280. His concern is that the Bill sets the threshold for illegality too low and that in spite of the direction provided by Clause 170, the standards for determining illegality are too vague.
I will make a couple of points on that thought. Clause 170(6) directs that a provider must have
“reasonable grounds to infer that all elements necessary for the commission of the offence, including mental elements, are present or satisfied”,
but that does not mean that the platform has to be certain that the content is illegal before it takes it down. This is concerning when you take it in combination with what or who will make judgments on illegality.
If a human moderator makes the decision, how much information they gather in order to make that judgment will depend on the resources and time available to them. Unlike in a court case, where a wide range of information and context can be gathered, such resources are very rarely available for decisions about content online, where human moderators have a vast amount of content to get through.
If an automated system makes the judgment, it is very well established that algorithms are not good at context—the Communications and Digital Committee took evidence on this repeatedly when I was on it. AI simply uses the information available in the content itself to make a decision, which can lead to significant missteps. Clause 170(3) requires the decision-makers to judge whether there is a defence for the content. In the context of algorithms, it is very unclear how they will come to such a judgment from the content itself.
I understand that these are probing amendments, but I think the concern is that the vagueness of the definition will lead to too much content being taken down. This concern was supported by Parliament’s Joint Committee on Human Rights, which wrote to the former Culture Secretary, Nadine Dorries, on that matter. I apologise again.
My Lords, I support the amendments in this group that probe how removing illegal material is understood and will be used under the Bill. The noble Lord, Lord Moylan, explained a lot of my concerns, as indeed did the noble Viscount, Lord Colville, via his avatar. We have heard a range of very interesting contributions that need to be taken seriously by the Government. I have put my name to a number of amendments.
The identification of illegal material might be clear and obvious in some cases—even many cases. It sounds so black and white: “Don’t publish illegal material”. But defining communications of this nature can be highly complex, so much so that it is traditionally reserved for law enforcement bodies and the judicial system. We have already heard from the noble Lord, Lord Moylan, that, despite Home Secretaries, this House, regulations and all sorts of laws having indicated that non-crime hate incidents, for example, should not be pursued by the police, they continue to pursue them as though they are criminal acts. That is exactly the kind of issue we have.
In talking about individuals and investigations, the noble Baroness reminded me of one class of content where we do have clarity, and that is contempt of court. That is a frequent request. We know that it is illegal in that case because a judge writes to the company and says, "You must not allow this to be said because it is in contempt of court". In live proceedings, in most cases it is absolutely clear because a judge has told you, but that really is the exception. In most other cases, someone is saying, "I think it is illegal".
That is very helpful.
I am concerned that removing so-called illegal content for the purpose of complying with the regulatory system covers not only that which reaches conviction in a criminal court but possibly anything that a platform determines could be illegal, and therefore it undermines our own legal system. As I have said, that marks a significant departure from the rule of law. It seems that the state is asking or mandating private companies to make determinations about what constitutes illegality.
The obligations on a platform to determine what constitutes illegality could obviously become a real problem, particularly in relation to limitations on free expression. As we have already heard, the Public Order Act 1986 criminalises, for example, those who stir up hatred through the use of words, behaviour or written material. That is contentious in the law offline. By "contentious", I mean that it is a matter of difficulty that requires the full rigour of the criminal justice system, understanding the whole history of established case law. That is all necessary to secure a conviction under that law for offences of this nature.
Now we appear to be saying that, without any of that, social media companies should make the decision, which is a nerve-racking situation to be in. We have already heard the slippery phrase "reasonable grounds to infer". If that was the basis on which you were sent to prison—if they did not have to prove that you were guilty but they had reasonable grounds to infer that you might be, without any evidence—I would be worried, yet reasonable grounds to infer that the content could be illegal is the basis on which we are asking for those decisions to be made. That is significantly below the ordinary burden of proof required to determine that an illegal act has been committed. Under this definition, I fear that platforms will be forced to over-remove and censor what ultimately will be entirely lawful speech.
Can the Minister consider what competency social media companies have to determine what is lawful? We have heard some of the dilemmas from somebody who was in that position—let alone the international complications, as was indicated. Will all these big tech companies have to employ lots of ex-policemen and criminal lawyers? How will it work? It seems to me that there is a real lack of qualifications in that sphere—that is not a criticism, because those people decided to work in big tech, not in criminal law, and yet we are asking them to pursue this. That is a concern.
I will also make reference to what I think are the controversies around government Amendments 136A and 136B to indicate the difficulties of these provisions. They concern illegal activity—such as "assisting unlawful immigration", illegal entry, human trafficking and similar offences—but I am unsure as to how this would operate. While it is the case that certain means of entering the UK are illegal, I suddenly envisage a situation where a perfectly legitimate political debate—for example, about the small boats controversy—would be taken down, and that people advocating for a position against the Government's new Illegal Migration Bill could be accused of supporting illegality. What exactly will be made illegal in those amendments to the Online Safety Bill?
The noble Baroness, Lady Buscombe, made a fascinating speech about an interesting group of amendments. Because of the way the amendments are grouped, I feel that we have moved to a completely different debate, so I will not go into any detail on this subject. Anonymous trolling, Twitter storms and spreading false information are incredibly unpleasant. I am often the recipient of them—at least once a week—so I know personally that you feel frustrated that people tell lies and your reputation is sullied. However, I do not think that these amendments offer the basis on which that activity should be censored, and I will definitely argue against removing anonymity clauses—but that will be in another group. It is a real problem, but I do not think that the solution is contained in these amendments.
My Lords, my contribution will be less officious than my intervention earlier in this group. In the last couple of years since I returned to the House—as I describe it—having spent time at the Charity Commission, I have noticed a new practice emerging of noble Lords reading out other people’s speeches. Every time I had seen it happen before, I had not said anything, but today I thought, “I can’t sit here and not say anything again”. I apologise for my intervention.
I am grateful to my noble friend Lord Moylan for bringing forward his amendments and for introducing them in the incredibly clear way he did; they cover some very complex and diverse issues. I know that there are other amendments in the group which might be described as similar to his.
There are a couple of things I want to highlight. One interesting thing about the debate on this group is the absence of some of our legal friends—I apologise to my noble friend Lady Buscombe, who is of course a very distinguished lawyer. The point I am making is that we are so often enriched by a lot of legal advice and contributions on some of the more challenging legal issues that we grapple with, but we do not have that today, and this is a very difficult legal issue.
It is worth highlighting again, as has been touched on a little in some of the contributions, the concern, as I understand it, with how the Bill is drafted in relation to illegal content and the potential chilling effect of these clauses on social media platforms. As has already been said, there is a concern that it might lead them to take a safety-first approach in order to avoid breaking the law and incurring the sanctions and fines that come with the Bill, which Ofcom will have the power to apply. That is the point we are concerned with here. It is the way in which this is laid out, and people who are much better equipped than I am have already explained the difference between evidence and "reasonable grounds to infer".
What the noble Lord, Lord Allan, hit on in his contribution is also worth taking into account, and that is the role of Ofcom in this situation. One of the things I fear, as we move into an implementation phase and the consequences of the Bill start to impact on the social media firms, is the potential for the regulator to be weaponised in a battle on the cultural issues that people are becoming increasingly exercised about. I do not have an answer to this, but I think it is important to understand the danger of where we might get to in the expectations of the regulator if we create a situation where the social media platforms are acting in a way that sends people looking for recourse, or for a venue in which to pursue an argument and a battle, which will not be helpful at all.
I am not entirely sure, given my lack of legal expertise—this is why I would have been very grateful for some legal expertise on this group—whether what my noble friend is proposing in his amendments is the solution, but I think we need to be very clear that this is a genuine problem. I am not sure, as things stand in the Bill, that we should be comfortable that it is not going to create problems. We need to find a way to be satisfied that this has been dealt with properly.
I want to clarify one point. I have had a slightly different experience, which is that for many people—women, at least—whom I have talked to recently, there is an over-enthusiasm and an over-zealous attitude to policing the speech of particular women and, as we have already heard, gender-critical women. It is often under the auspices of hate speech and there is all sorts of discussion about whether the police are spending too long trawling through social media. By contrast, if you want to get a policeman or policewoman involved in a physical crime in your area, you cannot get them to come out. So I am not entirely convinced. I think policing online speech at least is taking up far too much of the authorities’ time, not too little time, and distracting them from solving real social and criminal activity.
I defer to the noble Baroness, Lady Fox, on speech crime. That is not the area of my expertise, and it is not the purpose of my points. My points were to do with the kinds of crime that affect children in particular. His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services is very specific about that point. It says that “unacceptable delays are commonplace” and it gives a very large number of case studies. I will not go through them now because it is Thursday afternoon, but I think noble Lords can probably imagine the kinds of things we are talking about. They include years of delay, cases not taken seriously or overlooked, evidence lost, and so forth. The report found that too often children were put at risk because of this, and offenders were allowed to escape justice, and it gave 17 recommendations for how the police force should adapt in order to meet this challenge.
So my questions to the Minister are these. When we talk about things such as age verification for hardcore porn, we are quite often told that we do not need to worry about some of this because it is covered by illegal content provisions, and we should just leave it to the police to sort out. His Majesty’s Inspectorate gives clear evidence—this is a recent report from last month—that this is simply not happening in the way it should be. I therefore wondered what, if anything, is in the Bill to try to close down this particular gap. That would be very helpful indeed.
If it is really not for the purposes of this Bill at all—if this is actually to do with other laws and procedures, other departments and the way in which the resources for the police are allocated, as the noble Baroness, Lady Fox, alluded to—what can the Government do outside the boundaries of this legislation to mobilise the police and the prosecution services to address what I might term “digital crimes”: that is, crimes that would be followed up with energy if they occurred in the real world but, because they are in the digital world, are sometimes overlooked or forgotten?
My Lords, I really appreciated the contribution from the noble Baroness, Lady Ritchie of Downpatrick, because she asked a lot of questions about this group of amendments. Although I might be motivated by different reasons, I found it difficult to fully understand the impact of the amendments, so I too want to ask a set of questions.
Harm is defined in the Bill as “physical or psychological harm”, and there is no further explanation. I can understand the frustration with that and the attempts therefore to use what are described as the
“widely understood and used 4 Cs of online risk to children”.
They are not widely understood by me, and I have ploughed my way through it. I might well have misunderstood lots in it, but I want to look at and perhaps challenge some of the contents.
I was glad that Amendment 20 recognises the level of risk of harm to different age groups. That concerns me all the time when we talk about children and young people and then end up treating four-year-olds, 14-year-olds and 18-year-olds as though they were the same. I am glad that that is there, and I hope that we will look at it again in future.
I want to concentrate on Amendment 93 and reflect and comment more generally on the problem of a definition, or a lack of definition, of harm in the Bill. For the last several years that we have been considering bringing this Bill to this House and to Parliament, I have been worried about the definition of psychological harm. That is largely because this category has become ever more expansive and quite subjective in our therapeutic age. It is a matter of some discussion and quite detailed work by psychologists and professionals, who worry that there is an ever-expanding concept of what is considered harmful and of what psychological harm really means.
As an illustration, I was invited recently to speak to a group of sixth-formers and was discussing things such as trigger warnings and so on. They said, "Well, you know, you've got to understand what it's like"—they were 16-year-olds. "When we encounter certain material, it makes us have PTSD". I was thinking, "No, it doesn't really, does it?" Post-traumatic stress disorder is something that you might well develop if you have been in the middle of a war zone. The whole concept of triggering came from psychological and medical insights from the First World War, which you can understand. If you hear a car backfiring, you think it is somebody shooting at you. But the idea here is that we should have trigger warnings on great works of literature and that if we do not it will lead to PTSD.
I am not being glib, because an expanded, elastic and pathologised view of harm is being used quite cavalierly and casually in relation to young people and protecting them, often by the young people themselves. It is routinely used to close down speech as part of the cancel culture wars, which, as noble Lords know, I am interested in. Is there not a danger that this concept of harm is not as obvious as we think, and that the psychological harm issue makes it even more complicated?
The other thing is that Amendment 93 says:
“The harms in this Schedule are a non-exhaustive list of categories and other categories may be relevant”.
As with the discussion on whose judgment decides the threshold for removing illegal material, I think that judging what is harmful is even more tricky for the young in relation to psychological harm. I was reminded of that when the noble Baroness, Lady Kidron, complained that what she considered to be obviously and self-evidently harmful, Meta did not. I wondered whether that is just the case with Meta, or whether views will differ when it comes to—
The report found—I will not give a direct quotation—that social media contributed to the death of Molly Russell, so it was the court’s judgment, not mine, that Meta’s position was indefensible.
I completely understand that; I was making the point that there will be disagreements in judgments. In that instance, it was resolved by a court, but we are talking about a situation where I am not sure how the judgment is made.
In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to provide some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for "Naked Education" on Channel 4, maybe because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 is presenting this programme, which features naked adults and children, as educational by saying that it is introducing children to the naked body. I think it is harmful for children and that it should not be on the television, but it is advertised on social media—I have seen quite a lot of it.
The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominantly young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and performing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.
This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I am raising these matters because we think we all agree that there is a consensus on what is harmful material online for young people, but it is not that straightforward.
The amendment states that the Bill should target any platform that posts
“links to, or … encourages child users to seek”
out “dangerous or illegal activity”. I understand “illegal activity”, but on “dangerous” activities, I assume that we do not mean extreme sports, mountain climbing and so on, which are dangerous—that comes to mind probably because I have spent too much time with young people who spend their whole time looking at those things. I worry about the unintended consequences of things being banned or misinterpreted in that way.
To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, first, under “Content harms” in proposed new paragraph 3(c) and (d), Meta would have been at fault; under “Contact harms” in proposed new paragraph 4(b), Meta would have been at fault; under “Conduct harms” in proposed new paragraph 5(b), Meta would have been at fault; and under “Commercial harms” in proposed new paragraph 6(a) and (b), Meta would have been at fault. That would have made things a great deal simpler.
I appreciate that this is the case we all have in the back of our minds. I am asking whether, when Meta says it is content agnostic, the Bill is the appropriate place for us to list the topics that we consider harmful. If we are to do that, I was giving examples of contentious, harmful topics. I might have got this wrong—
I will answer the noble Baroness more completely when I wind up, but I just want to say that she is missing the point of the schedule a little. Like her, I am concerned about the way we concentrate on content harms, but she is bringing it back to content harms. If she looks at it carefully, a lot of the provisions are about contact and conduct: it is about how the system is pushing children to do certain things and pushing them to certain places. It is about how things come together, and I think she is missing the point by keeping going back to individual pieces of content. I do not want to take the place of the Minister, but this is a systems and processes Bill; it is not going to deal with individual pieces of content in that way. It asks, “Are you creating these toxic environments for children? Are you delivering this at scale?” and that is the way we must look at this amendment.
I will finish here, because we have to get on, but I did not introduce content; it is in the four Cs. One of the four Cs is “content” and I am reacting to amendments tabled by the noble Baroness. I do not think I am harping on about content; I was responding to amendments in which content was one of the key elements.
Lords Chamber

My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships' House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary in order to listen to that call, because there are reasons that have persuaded a competent authority that the police service, or whatever, listening to my telephone call has a reason to do so, to avoid public harm or meet some other justified objective agreed on through legislation.
Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.
That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and it is entirely understood, and to do it in this space is completely at odds with the way in which we felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.
My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.
We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that chat services matter so much to the citizens of the United Kingdom that they are used by 60% of the total population should make us think carefully about what we are doing to these services.
End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because they are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock which only you and the recipient have the special key to unlock to read them.
Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.
Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.
We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.
The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would require the surveillance of encrypted communications regarding child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.
Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.
I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, "It's OK as long as we can deal with that". This is put forward as a solution to the problem of encrypted chat services being used to send messages of that nature, and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that we are promising a silver bullet through this Bill, suggesting that it will all be solved through some of these measures.
No one in the Committee or anyone standing behind us who speaks up for children thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created, and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which it is creating child abuse—new forms, new technologies, new abuse.
I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.
I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.
Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.
Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.
My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users. As I have explained, they would not put up with it if the Chinese state said, "We have to see people's messages". They would just say, "We are encrypted services". They would walk out of China and we would all say, "Well done". There is a real, strong possibility of these services leaving the UK so we must be very careful.
I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.
As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.
The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.
My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.
I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice
“to a regulated service which offers private messaging with end-to-end encryption”;
and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.
Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that because they are doing an awful lot of good work.
Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.
My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.
If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, "I wouldn't recommend people use this because it turns out that this company sends all its users' data to China so that it can do general monitoring".
I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether—and if we are, how—we are bringing the regulation and misuse of VPNs into scope for regulation by Ofcom.
My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.
I did, and I am happy to say it again: yes.
Perhaps I might go back to an earlier point. When the Minister said the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that the obligations to the users of the service are, in the instance of encrypted services, to protect their privacy, and they see that as keeping them safe. It would be wrong to make that a polar opposite. I think that companies that run unencrypted services believe that to be what their duties are—so that in a way is a clash.
Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried about whether duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but that the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and that can be guaranteed by the Government, and they made that clear, it would reassure not just the companies but the users of messaging services, which would be helpful.
The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.
The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may also be a role in enforcement action, too; Ofcom will be able to apply to the courts to require these services where appropriate to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “Caveat emptor” when looking at some of these providers.
I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.
Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom's human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters, in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.
Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.
Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual's private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—
Lords Chamber

My Lords, in moving Amendment 4, I will also speak to Amendments 6 to 8 and 12 and consequential Amendments 288 and 305, largely grouped under the heading "exemptions". In this group I am also particularly sympathetic to Amendment 9 in the names of the noble Lords, Lord Moylan and Lord Vaizey, and I will leave them to motivate that. I look forward to hearing from the noble Lord, Lord Knight, an explanation for his Amendment 9A.
Last Wednesday we discussed the purposes of the Bill, and there was much agreement across the Chamber on one issue at least: that we need to stay focused and make sure that an already highly complex piece of legislation does not become even more unwieldy. My concern in general is that the Bill already suffers throughout from being overly broad in its aims, resulting in restrictions on the online experience and expression of everyone. This series of amendments is about trying to rein in the scope, allowing us to focus on clear targets rather than a one-size-fits-all Bill that sweeps all in its wake with perhaps unintended and damaging consequences.
The Bill creates an extraordinary set of regulatory burdens on tens of thousands of British businesses, micro-communities and tech platforms, no matter the size. The impact assessment claims that 25,000 businesses are in scope, and that is considered a conservative estimate. This implies that an extraordinary range of platforms, from Mumsnet and Wikipedia to whisky-tasting forums and Reddit, will be caught up in this Bill. Can we find a way of removing the smaller platforms from scope? It will destroy too many of them if they have to comply with the regulatory burden created with huge Silicon Valley behemoths in mind.
Let us consider some of the regulatory duties that these entities are expected to comply with. They will need to undertake extensive assessments that must be repeated whenever a product changes. They will need to proactively remove certain types of content, involving assessing the risk of users encountering each type of illegal content, the speed of dissemination and functionality, the design of the platform and the nature and severity of the risk of harms presented to individual users. This will mean assessing their user base and implementing what are effectively surveillance systems to monitor all activity on their platforms.
Let us consider what a phrase such as “prevent from encountering” would mean to a web host such as Wikipedia. It would mean that it would need to scan and proactively analyse millions of edits across 250 languages for illegality under UK-specific law and then block content in defiance of the wishes of its own user community. There is much more, of course. Rest assured, Ofcom’s guidance and risk assessment will, over time, increase the regulatory complexity and the burdens involved.
Those technological challenges do not even consider the mountain of paperwork and administrative obligations that will be hugely costly and time consuming. All that might be achievable, if onerous, for larger platforms. But for smaller ones it could prove a significant problem, with SMEs and organisations working with a public benefit remit particularly vulnerable. Platforms with the largest profits and the most staff dedicated to compliance will, as a consequence, dominate at the expense of start-ups, small companies and community-run platforms.
No doubt the Government and the Minister will assure us that the duties are not so onerous and that they are manageable and proportionate. The impact assessment estimates that implementing the Bill will cost businesses £2.5 billion over the first 10 years, but all the commentators I have read think this is likely to be a substantial underestimate, especially when we are told in the same impact assessment that the legal advice is estimated to cost £39.23 per hour. I do not know what lawyers the Government hang out with, but they appear not to have a clue about the going rate for specialist law firms.
Also, what about the internal staff time? Again, the impact assessment assumes that staff will require only 30 minutes to familiarise themselves with the requirements of the legislation and 90 minutes to read, assess and change the terms and conditions in response to the requirements. Is this remotely serious? Even working through the groups of amendments has taken me hours. It has been like doing one of those 1,000-piece jigsaws, but at least at the end of those you get to see the complete picture. Instead, I felt as though somebody had come in and thrown all the pieces into the air again. I was as confused as ever.
If dealing with groups of amendments to this Bill is complex, that is nothing on the Bill itself, which is dense and often impenetrable. Last week, the Minister helpfully kept telling us to read the Explanatory Notes. I have done that several times and I am still in a muddle, yet somehow the staff of small tech companies will conquer all this and the associated regulatory changes in an hour and a half.
Many fear that this will replicate the worst horrors of GDPR, which, according to some estimates, led to an 8% reduction in the profits of smaller firms while it had little or no effect on the profits of large tech companies. That does not even take into account the cost of the near nervous breakdowns that GDPR caused small organisations, as I know from my colleagues at the Academy of Ideas.
These amendments try to tackle this disproportionate burden on smaller platforms—those companies that are, ironically, often useful challenges and antidotes to big tech’s dominance. The amendments would exempt them unless there is a good reason for specific platforms to be in scope. Of course, cutting out those in scope may not appeal to everyone here. From looking at the ever-increasing amendments list, it seems that some noble Lords have an appetite for expanding the number of services the legislation will apply to; we have already heard the discussion about app stores and online gaming. But we should note that the Government have carved out other exemptions for certain services that are excluded from the new regulatory system. They have not included emails, SMS messages, one-to-one oral communications and so on. I am suggesting some extra exemptions and that we remove services with fewer than 1 million monthly UK users. Ofcom would have the power to issue the provider with a notice bringing them into scope, but only based on reasonable grounds, having identified a safety risk and with 30 days’ notice.
If we do not tackle this, I fear that there is a substantial, serious and meaningful risk that smaller platforms based outside and inside the UK will become inaccessible to British users. It is notable that over 1,000 US news websites blocked European users during the EU’s introduction of GDPR, if noble Lords remember. Will there be a similar response to this law? What, for example, will the US search engine DuckDuckGo conclude? The search engine emphasises privacy and refuses to gather information on its users, meaning that it will be unable to fulfil the duties contained in the Bill of identifying or tailoring search results to users based on their age. Are we happy for it to go?
I fear that this Bill will reduce the number of tech platforms operating in the UK. This is anti-competitive. I do not say that because I have a particular commitment to competition and the free market, by the way. I do so because competition is essential and important for users’ choice and empowerment, and for free speech—something I fear the Bill is threatening. Indeed, the Lords’ Communications and Digital Committee’s extensive inquiry into the implications of giving large tech companies what is effectively a monopoly on defining which speech is free concluded:
“Increasing competition is crucial to promoting freedom of expression online. In a more competitive market, platforms would have to be more responsive to users’ concerns about freedom of expression and other rights”.
That is right. If users are concerned that a platform is failing to uphold their freedom of expression, they can join a different platform with greater ease if there is a wide choice. Conversely, users who are concerned that they do not want to view certain types of material would be more easily able to choose another platform that proscribes said material in its terms and conditions.
I beg to move the amendment as a way of defending diversity, choice and innovation—and as a feeble attempt to make the Bill proportionate.
The Bill creates a substantial new role for Ofcom, but Ofcom has already recruited substantially and prepared to carry out that new duty effectively. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.
I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with illegal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.
Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium-sized enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be
“working to benefit the public”.
I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.
Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.
Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed, as it would exempt search services that may pose a significant risk of harm to children or host illegal content. The amendment aims to exempt specialised search services—that is, those that allow users to
“search for … products or services … in a particular sector”.
It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.
The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.
The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.
I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.
My Lords, I thank people for such a wide-ranging and interesting set of contributions. I take comfort from the fact that so many people understood what the amendments were trying to do, even if they did not fully succeed in that. I thought it was quite interesting that in the first debate the noble Lord, Lord Allan of Hallam, said that he might be a bit isolated on the apps, but I actually agreed with him—which might not do his reputation any good. However, when he said that, I thought, “Welcome to my world”, so I am quite pleased that this has not all been shot down in flames before we started. My amendment really was a serious attempt to tackle something that is a real problem.
The Minister says that the Bill is designed to avoid disproportionate burdens on services. All I can say is, “Sack the designer”. It is absolutely going to impose a disproportionate burden on a wide range of small services, which will not be able to cope, and that is why so many of them are worried about it. Some 80% of the companies that will be caught up in this red tape are small and micro-businesses. I will come to the small business point in a moment.
The noble Baroness, Lady Harding, warned us that small tech businesses become big tech businesses. As far as I am concerned, that is a success story—it is what I want; is it not what we all want? Personally, I think economic development and growth is a positive thing—I do not want them to fail. However, I do not think it will ever happen: small tech businesses will never grow into big tech businesses if they face a disproportionate regulatory burden of the kind I have tried to describe. That is what I am worried about, and it is not a positive outcome to be celebrated.
I stress that it is not just small tech and big tech. There are also community sites, based on collective moderation. Wikipedia has had a lot of discussion here. For a Bill that stresses that it wants to empower users, we should think about what it means when these user-moderated community sites are telling us that they will not be able to carry on. That is what they are saying. It was interesting that the noble Lord, Lord Clement-Jones, said that he relies on Wikipedia—many of us do, although please do not believe what it says about me. There are all of these things, but then there was a feeling that, well, Reddit is a bit dodgy. The Bill is not meant to decide which ones to trust in quite that way, or to police people’s tastes.
I was struck that the noble Baroness, Lady Kidron, said that small is not safe, and used the incel example. I am not emphasising that small is safe; I am saying that the small entities will not survive this process. That is my fear. I do not mean that the big ones are nasty and dangerous and the small ones are cosy, lovely and Wikipedia-like. I am suggesting that smaller entities will not be able to survive the regulatory onslaught. That is the main reason I raised this.
The noble Baroness, Lady Merron, said that these entities can cause great harm. I am worried about a culture of fear, in which we demonise tens of thousands of innocent tech businesses and communities and end up destroying them when we do not intend to. I tried to put in the amendment an ability for Ofcom, if there are problematic sites that are risky, to deal with them. As the Minister kept saying, low-risk search engines have been exempted. I am suggesting that low-risk small and micro-businesses are exempted, which is the majority of them. That is what I am suggesting, rather than that we assume they are all guilty and then they have to get exempted.
Interestingly, the noble Lord, Lord McCrea, asked how many pornography sites are in scope and which pornographic websites have a million or fewer users. I am glad I do not know the answer to that, otherwise people might wonder why I did. The point is that there are always going to be sites that are threatening or a risk to children, as we are discussing. But we must always bear in mind—this was the important point made by the noble Lord, Lord Moylan—that in our absolute determination to protect children via this Bill, we must not unintentionally damage society as a whole. Adult access to free speech, for example, is one of my concerns, as are businesses and so on. We should not have that as an outcome.
My Lords, as might be expected, I will speak against Amendment 26 and will explain why.
The children’s charity Barnardo’s—here I declare an interest as vice-president—has said, as has been said several times before, that children are coming across pornographic content from as young as seven. Often they stumble across the content accidentally, unwittingly searching for terms such as “sex” or “porn”, without knowing what they mean. The impact that this is having on children is huge. It is harming their mental health and distorting their perception of healthy sexual relationships and consent. That will go with them into adulthood.
Age verification for pornography and age assurance to protect children from other harms are crucial to protect children from this content. In the offline world, children are rightly not allowed to buy pornographic DVDs in sex shops but online they can access this content at the click of a button. This is why I will be supporting the amendments from the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, and am fully supportive of their age assurance and age verification schedule.
My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.
We often boast that we are a self-regulating House and that that somehow makes us superior to those up the road—we are all so mature because we self-regulate; people do behave badly but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would dispute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.
I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.
One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberty activists and whistleblowers. Many Wikipedia editors are anonymised, maybe because they are editing sites on politically controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to it, but it is important to understand that those behind Amendment 26, and those who say that we should look at the question of age verification, are not doing so because they do not care about children or are uninterested in protecting them. However, the dilemmas that any age-gating or age verification poses for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case they come across it. Again, that will have a detrimental impact on adult access to all knowledge.
These will be controversial issues, and we will come back to them, but it is good to have started the discussion.
My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.
We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.
When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.
There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether in Schedule 1 or in another way, of making sure that Wikipedia is not overly affected by this—maybe the risk profile drawn up by Ofcom will ensure that it is not unduly impacted.
(1 year, 6 months ago)
Lords Chamber
My Lords, I too support this amendment. I was at a dinner last night in the City for a group of tech founders and investors—about 500 people in a big hotel ballroom, all focused on driving the sort of positive technology growth in this country that I think everyone wants to see. The guest speaker runs a large UK tech business. He commented in his speech that tech companies need to engage with government because—he said this as if it were a revelation—Governments turn out not to speak with one voice, and understanding what Governments require of tech companies is not always easy. Business needs clarity, and anyone who has run a large or small business knows that it is not really the clarity in the detail that matters but the clarity of purpose that enables you to lead change, because then your people understand why they need to change, and if they understand why, then in each of the micro-decisions they take each day they can adjust those decisions to fit with the intent behind your purpose. That is why this amendment is so important.
I have worked in this space of online safety for more than a decade, both as a technology leader and in this House. I genuinely do not believe that business is wicked and evil, but what it lacks is clear direction. The Bill is so important in setting those guardrails that if we do not make its purpose clear, we should not be surprised if the very businesses which really do want Governments to be clear do not know what we intend.
I suspect that my noble friend the Minister might object to this amendment and say that it is already in the Bill. As others have already said, I actually hope it is. If it is not, we have a different problem. The point of an upfront summary of purpose is to do precisely that: to summarise what is in what a number of noble Lords have already said is a very complicated Bill. The easier and clearer we can make it for every stakeholder to engage in the Bill, the better. If alternatively my noble friend the Minister objects to the detailed wording of this amendment, I argue that that simply makes getting this amendment right even more important. If the four noble Lords, who know far more about this subject than I will ever do in a lifetime, and the joint scrutiny committee, which has done such an outstanding job at working through this, have got the purposes of the Bill wrong, then what hope for the rest of us, let alone those business leaders trying to interpret what the Government want?
That is why it is so important that we put the purposes of the Bill absolutely at the front of the Bill, as in this amendment. If we have misunderstood that in the wording, I urge my noble friend the Minister to come back with wording on Report that truly encapsulates what the Government want.
My Lords, I welcome this opportunity to clarify the purposes of the Bill, but I am not sure that the amendment helps as my North Star. Like the Bill, it throws up as many questions as answers, and I found myself reading it and thinking “What does that word mean?”, so I am not sure that clarity was where I ended up.
It is not a matter of semantics, but in some ways you could say—and certainly this is how it is publicly understood—that the name of the Bill, the Online Safety Bill, gives it its chief purpose. Yet however well-intentioned, and whatever the press releases say or the headlines print, even a word such as “safety” is slippery, because safety as an end can be problematic in a free society. My worry about the Bill is unintended consequences, and that is not rectified by the amendment. As the Bill assumes safety as the ultimate goal, we as legislators face a dilemma. We have the responsibility of weighing up the balance between safety and freedom, but the scales in the Bill are well and truly weighted towards safety at the expense of freedom before we start, and, again, I am not convinced that the amendment tips them back.
Of course, freedom is a risky business, and I always like the opportunity to quote Karl Marx, who said:
“You cannot pluck the rose without its thorns!”
However, it is important to recognise that “freedom” is not a dirty word, and we should avoid saying that risk-free safety is more important than freedom. How would that conversation go with the Ukrainian people, who risk their safety daily for freedom? Also, even the language of safety—or indeed what constitutes the harms that the Bill and the amendments promise to keep the public safe from—needs to be considered in the cultural and social context of the norms of 2023. A new therapeutic ethos now posits safety in ever-expanding pseudo-psychological and subjective terms, and this can be a serious threat to free speech. We know that some activists often exploit that concept of safety to claim harm when they merely encounter views they disagree with. The language of safety and harm is regularly used to cancel and censor opponents—and the Government know that, so much so that they considered it necessary to introduce the Higher Education (Freedom of Speech) Bill to secure academic freedom against an escalating grievance culture that feigns harm.
Part of the triple shield is a safety duty to remove illegal content, and the amendment talks about speech within the law. That sounds unobjectionable—in my mind it is far better than “legal but harmful”, which has gone—but, while illegality might sound clear and obvious, in some circumstances it is not. That is especially true of any legal limitation on speech. We all know about the debates around hate speech, for example. These things are contentious offline, and even the police, in particular the College of Policing, seem to find the concept of that kind of illegality confusing and, at the moment, are in a dispute with the Home Secretary over just that.
Is it really appropriate that this Bill enlists and mandates private social media companies to judge criminality using the incredibly low bar of “reasonable grounds to infer”? It gets even murkier when the legal standard for permissible speech online will be set partly by compelling platforms to remove content that contravenes their terms and conditions, even if those terms of service restrict speech far more than domestic UK law does. Big tech is being incited to censor whatever content it wishes, as long as that fits its Ts & Cs. Between this and determining, for example, what goes into filters—a whole different issue—one huge irony here, which challenges one of the purposes of the Bill, is that, despite the Government and many of us thinking that this legislation will de-fang and regulate big tech’s powers, it could inadvertently give those same corporates more control over what UK citizens read and view.
Another related irony is that the Bill was, no doubt, designed with Facebook, YouTube, Twitter, Google, TikTok and WhatsApp in mind. However, as the Bill’s own impact assessment notes, 80% of impacted entities have fewer than 10 employees. Many sites, from Wikipedia to Mumsnet, are non-profit or empower their own users to make moderation or policy decisions. These sites, and tens of thousands of British businesses of varying sizes, perhaps unintentionally, now face an extraordinary amount of regulatory red tape. These onerous duties and requirements might be achievable, if not desirable, for larger platforms, but for smaller ones with limited compliance budgets they could prove a significant if not fatal burden. I do not think that is the purpose of the Bill, but it could be an unintended outcome. This also means that regulation could, inadvertently, act as a barrier to entry for new SMEs, creating an ever more monopolistic stronghold for big tech, at the expense of trialling innovations or allowing start-ups to emerge.
I want to finish with the thorny issue of child protection. I have said from the beginning—I mean over the many years since the Bill’s inception—that I would have been much happier if it was more narrowly titled as the Children’s Online Safety Bill, to indicate that protecting children was its sole purpose. That in itself would have been very challenging. Of course, I totally agree with Amendment 1’s intention
“to provide a higher level of protection for children than for adults”.
That is how we treat children and adults offline.
My Lords, I am one of those who found the Bill extremely complicated, but I do not find this amendment extremely complicated. It is precise, simple, articulate and to the point, and I think it gives us a good beginning for debating what is an extremely complex Bill.
I support this amendment because I believe, and have done so for a very long time, that social media has done a great deal more harm than good, even though it is capable of doing great good. Whether advertently or inadvertently, the worst of all things it has done is to destroy childhood innocence. We are often reminded in this House that the prime duty of any Government is to protect the realm, and of course it is. But that is a very broad statement. We can protect the realm only if we protect those within it. Our greatest obligation is to protect children—to allow them to grow up, so far as possible, uncorrupted by the wicked ways of a wicked world and with standards and beliefs that they can measure actions against. Complex as it is, the Bill is a good beginning, and its prime purpose must be the protection and safeguarding of childhood innocence.
The noble Lord, Lord Griffiths of Burry Port, spoke a few moments ago about the instructions he was given as a young preacher. I remember when I was training to be a lay reader in the Church of England, 60 or more years ago, being told that if you had been speaking for eight minutes and had not struck oil, stop boring. I think that too is a good maxim.
We have got to try to make the Bill comprehensible to those around the country whom it will affect. The worst thing we do, and I have mentioned this in connection with other Bills, is to produce laws that are unintelligible to the people in the country; that is why I was very sympathetic to the remarks of my noble friend Lord Inglewood. This amendment is a very good beginning. It is clear and precise. I think nearly all of us who have spoken so far would like to see it in the Bill. I see the noble Baroness, Lady Fox, rising—does she wish to intervene?
I want to explain more broadly that I am all for clarifying what the law is about and for simplicity, but that ship has sailed. We have all read the Bill. It is not simple. I do not want this amendment to somehow console us, so that we can say to the public, “This is what the Bill is about”, because it is not what the Bill is about. It is about a range of things that are not contained within the amendment—I would wish them to be removed from the Bill. I am concerned that we think this amendment will resolve a far deeper and greater problem of a complicated Bill that very few of us can grasp in its entirety. We should not con the public that it is a simple Bill; it is not.
Of course we should not. What I am saying is that this amendment is simple. If it is in the Bill, it should then be what we are aiming to create as the Bill goes through this House, with our hours of scrutiny. I shall not take part in many parts of this Bill, as I am not equipped to do so, but there are many in this House who are. Having been set the benchmark of this amendment, they can seek to make the Bill comprehensible to those of us—and that seems to include the noble Baroness, Lady Fox—who at the moment find it incomprehensible.
In a way, we are dealing with the most important subject of all: the protection of childhood innocence. We have got to err in that direction. Although I yield to no one in my passionate belief in the freedom of speech, it must have respect for the decencies of life and not be a propagator of the profanities of life.
(1 year, 9 months ago)
Lords Chamber
I am not aware that that is being done, but that is a matter for the Department for Education. I will refer the noble Lord’s point to the department.
My Lords, just to clarify the answers to some of the questions, I think all of us can understand that using CCTV to catch criminals and help victims is something that has become the norm. But the Minister has been asked whether the new technology changes things. Secondly, is there not a danger of a creep towards the surveillance of innocent people, which would not be something that the Government would endorse or condone?
There is a hugely important role for CCTV in providing assurance for people that our streets are safe, that our public spaces are being monitored and that, if crimes are committed, the people who commit them will be captured and brought to justice. That is a great reassurance to people as they go about their lawful business.
(1 year, 9 months ago)
Lords Chamber
My Lords, the Secretary of State, Michelle Donelan, has acknowledged that protecting children is the very reason that this Bill exists. If only the Government had confined themselves to that crucial task. Instead, I worry that the Bill has ballooned and could still be a major threat to the free expression of adults. I agreed with much of what the noble Baroness, Lady D’Souza, has just said.
Like some other noble Lords here, I am delighted that the Government have dropped the censorious “legal but harmful” clauses. It was disappointing to hear Labour MPs in the other place keen to see them restored. In this place, I have admired opposition resistance to assaults on civil liberties in, for example, the Public Order Bill. Perhaps I can appeal for consistency: to be just as zealous about free speech as a foundational civil liberty. I urge those pushing versions of censoring “legal but harmful” for adults to think again.
The Government’s counter to many freedom of expression concerns is that free speech is protected in various clauses, but stating that service providers must have regard to the importance of protecting users’ rights of freedom of speech is incredibly weak and woolly, giving free speech second-class status when contrasted with the operational safety duties that compel companies to remove material. Instead, we need a single comprehensive and robust statutory duty in favour of freedom of expression that requires providers to ensure that free speech is not infringed by measures taken to comply with other duties. Also, free speech should be listed as a relevant duty for which Ofcom has to develop a code of practice.
The Bill requires providers to include safety provisions for content in their terms of service. However, no similar requirement for free speech exists. It seems ironic that a Bill that claims to be clipping the power of big tech could actually empower companies to police and censor legal material in the name of safety, via the commercial route of terms and conditions.
The Government brush off worries that big tech is being encouraged to limit what UK citizens say or read online by glibly asserting that these are private companies and that they must be free to develop their own terms of service. Surely that is disingenuous. The whole purpose of the legislation is to interfere in private companies, compelling them to adhere to duties or face huge penalties. If the Government do not trust big tech with users’ safety, why do they trust them with UK citizens’ free speech rights? Similarly, consider the user empowerment duties. If users ask that certain specified types of legal content, such as hate or abuse, be blocked or filtered out, it is big tech that has the power to decide what is categorised under those headings.
Only last year, amendments put forward in this House on placing convicted sex-offending trans prisoners on the female estate were labelled online as hate-fuelled, transphobic abuse. However, with the ability to hear all sides of the debate online, and especially in the light of recent events in Scotland around the Gender Recognition Act, more and more people realise that such views are not hate but driven by concerns about safeguarding women’s rights. Would such a debate be filtered out online by overcautious labelling by big tech and the safety duties in its Ts and Cs?
Finally, like others, I am worried that the Secretary of State is given too much power—for example, to shape Ofcom’s codes of practice, which is a potential route for political interference. My concerns are fuelled by recent revelations. In the US, Elon Musk’s leaked Twitter files prove that, in the run-up to the 2020 election, Joe Biden’s presidential campaign routinely flagged up tweets and accounts that it wanted removed, influencing the suppression of the New York Post’s Hunter Biden laptop exposé. Here in the UK, only this week, a shocking Big Brother Watch report reveals that military operatives reported on online dissenting views on official Covid lockdown policies to No. 10 and the DCMS’s counter-disinformation unit, allowing Whitehall’s hotlines to giant media companies to suppress this legal content. Even the phrase “illegal” in the Bill can be politically weaponised, such as with the proposal to censor content allegedly promoting small boat crossings.
Free speech matters to democracy, and huge swathes of this Bill could threaten both unless we amend it appropriately.
(1 year, 10 months ago)
Lords Chamber
My Lords, I welcome this opportunity to discuss the regional distribution of Arts Council England funding. I thank the noble Lords, Lord Storey and Lord McNally, for the chance to raise some concerns.
First, no arts organisation should feel entitled to perpetual state funding as a right. It is totally appropriate to review and shake up which projects and whose artistic output merits public funds. But what is so striking in this funding round is that the criteria do not even pretend to be based on artistic merit at all, but seem to be purely political and, even more crassly, geographic.
The DCMS instruction to redistribute funding away from London has some winners, and I am delighted for both Blackburn and Bradford’s museums and art galleries, and for the Barnsley-based Brass Bands England, which has received funding for the first time, among many others—good luck to them. I am from the north, and it is great to say that we will support the arts in the north; I have no problem with that. But I am slightly anxious about the overall trajectory that reveals a patronising attitude to northern audiences and potentially a philistine attitude to the arts, nowhere better exemplified than in the plight of English National Opera.
Like others—in this, I uncharacteristically fully agree with the noble Lord, Lord Vaizey—I was shocked by the Arts Council’s treatment of English National Opera. Effectively, its chorus and orchestra are being closed down; they have been sacked. When the Arts Council announced the move, it did so with an ungracious and high-handed ultimatum, which I want to quote:
“ENO’s future is in their hands … We require English National Opera to move to another part of England if they wish to continue to receive support from us.”
But the financial offer it has been given is actually only half its usual budget, so I want to ask whether the Arts Council thinks that those in the north do not deserve full funding of the arts, and should make do with a cut-price, pound shop version of English National Opera.
Such cultural vandalism feels like virtue signalling, devoid of serious strategic thinking and forced through at speed. When Birmingham Royal Ballet relocated from London in the 1980s, it did so after five years’ consultation with audiences, staff and its new venue home; there has been no consultation in this instance. The move has to be completed in five months, and the Arts Council has not even bothered to consider where ENO might take up residence; it just has to go “up north”.
One venue that might work, given its size, is Factory International, Manchester’s soon-to-open multimillion-pound arts venue, itself a recipient of Arts Council funding. But no one asked it, and it has made it clear that it will not change its contemporary focus to accommodate the new tenant. Artistic director John McGrath stated that its goals are
“new works, not the traditional opera repertoire.”
What about the Grand, in Leeds, which has the largest stage in England outside London? But no—it already hosts the wonderful Opera North. Indeed, the whole venture of moving ENO north seems to be a slap in the face for Opera North, whose director, Richard Mantle, points out:
“It’s not a new idea to have a large professional opera company performing opera in the North; we’ve been doing it for 40 years”.
Somehow, in the debates about opera prompted by this ENO issue, we perhaps get a hint of what the Arts Council’s views are on both opera and its relationship to northern audiences, or to audiences in general. Darren Henley, the chief executive of the Arts Council, claims that opera needs to change to satisfy a new generation of audiences, who he claims want
“opera … presented in new ways: opera in car parks, opera in pubs, opera on your tablet.”
He suggests that such
“New ideas may seem heretic to traditionalists”.
They seem so to me. They are not novel or radical ideas, but they are cheap and second-rate gimmicks, as far as I am concerned, and they show a disparaging view of audiences and the art form. The premise seems to be the cliché that traditional opera, including some of the greatest music ever composed, appeals only to the fusty, rich upper classes and the privileged.
I am reminded of the incident last July, when the Deputy Prime Minister, Dominic Raab, accused Angela Rayner of being a champagne socialist for going to Glyndebourne, as though she were somehow betraying the working classes. I assume he was forgetting that, historically, opera has been a popular art form, enjoyed by millions of people of all social classes, all over the world. Being priced out by expensive tickets or not being able to afford to get the train to London is a problem, but it is very different from the snobbish chippiness that seems to imbue the political and artistic establishments’ implicit prejudice that the plebs will not be interested in, or get, high art. This attitude was on display recently, when the BBC announced that, in order to attract viewers from lower socioeconomic D and E groups, it would make “lighter” dramas, comedies and sports documentaries and use “factual entertainment competition formats”—yuck. It seems that, if you are poor, you will be given poor-quality programmes.
Perhaps that is too cynical, but the Arts Council director of music, Claire Mera-Nelson, has justified attacks on ENO, which, ironically, was set up nearly 100 years ago with the mission to bring opera to the masses—a noble cause. She said that there is insufficient growth in audience demand for traditionally staged large-scale opera. This seems to be a real bean-counter’s approach to the value of the arts. As acclaimed soprano Danielle de Niese asks:
“Do we need to sell as many tickets as the O2 to be recognised? … Should we declare war on everything that isn’t mainstream enough?”
She asks whether all we will be left with is “reality TV”. She then pleads with those who run the arts and politics to “recognise opera’s value” as art per se. But that seems a forlorn hope because valuing artistic excellence is often treated as an elitist endeavour by too many in arts funding and policy circles.
Since the Blair years and the setting up of the DCMS in the 1990s, arts organisations have been told that they must justify their funding using wordy social and economic criteria. “Art for art’s sake” arguments have too often been traduced as arcane, old-fashioned and self-indulgent, and a focus on aesthetics is assumed to alienate popular appeal. Arts organisations have been forced by funding carrots and sticks to show their worth as useful instruments of social and political change. It is true that many in the arts world have embraced this mission over recent years, with orchestras stressing that they are good for health and well-being and theatres opining on their role as community hubs. Often, these are defensive gestures, expressing an existential crisis in arts organisations about their role. In recent years, museums, galleries and classical music have all indulged in angst-ridden introspection about their alleged colonial roots and whiteness, and diversity and inclusion targets mean that outward engagement projects obsess over the age, skin colour and gender of audiences, rather than the artistic quality of their output.
The effect of all this has been the cumbersome politicisation of the arts world. There is too much “artivism” and propagandising and an existential crisis about the role of the arts. It is no surprise that Just Stop Oil activists feel free to desecrate artistic masterpieces to save the planet. Art is considered secondary to politics. All this emanates from the way that artistic excellence has been squeezed out of arts funding. If you look at the bureaucratic Arts Council development programmes, drenched in acronym-laden managerial speak, the intrinsic worth of art is barely visible. Utilitarianism rules the day. The creative local growth fund, the cultural development fund and the Great Place scheme all focus on local economic growth and unlocking productivity. We urgently need that to happen, but it should have been in the Autumn Statement, not forced on the arts.
As I have gone on about dumbing down, I want to finish by giving the Minister a bit of homework. I suggest that he and the Arts Council learn about the artistic tastes of ordinary people by reading The Intellectual Life of the British Working Class by Jonathan Rose, to understand the rich history of autodidacts thriving on intellectually challenging art and literature, and the new pamphlet by the artist and art critic Alexander Adams, Abolish the Arts Council, which critiques some of these instrumentalising themes. It is selling out as we speak as a stocking filler, but it is a good read.
(1 year, 12 months ago)
Lords Chamber
I thank the noble Baroness for her words of welcome. She will appreciate that her final point is one for business managers rather than for me but I reiterate, having been there at the genesis of the discussions that led to the Bill, that I am very keen to see it in your Lordships’ House and to give it that thorough scrutiny. It has already been well improved because of the work of the Joint Committee of both Houses, but it needs to come to your Lordships’ House so that we can scrutinise it properly.
My Lords, the original aim of the Bill was to tackle harm to children, which we can all agree on, but it has expanded enormously and some say represents a real threat to freedom of speech for adults. Will the Minister ensure that he not only sees stakeholders working with those interested in online safety for children but meets free speech organisations and civil liberty campaigners to ensure the Bill does not become a legislative piece of censorship?
The Bill contains strong safeguards for freedom of expression. No platforms will be required to remove legal content and all services will need to have regard to freedom of expression when implementing their safety duties. Of course, although Ministers have met such groups throughout the passage of the Bill so far, I would be very happy to continue to do so to ensure that aspect of the Bill gets proper scrutiny too.
(2 years ago)
Lords Chamber
My Lords, I thank the committee for this report. Even though I do not agree with many of its recommendations, it was a real treat to read—like a great primer or literature review. There is so much of the Online Safety Bill to worry about in terms of free speech that it is hard to know where to focus, so I will just make a few points.
I was especially grateful to see a refreshingly nuanced approach in the report to misinformation, which I focused on the last time we discussed these issues. As research from Ofcom notes, many believe that the term “misinformation” is being
“weaponised for censorship of valid alternative perspectives.”
The report’s examples from the lockdown and Covid era are pertinent: for example, expert medical opinion—albeit a minority view—that challenged either the Government or the World Health Organization was labelled misinformation, deemed so by big tech fact-checkers with no scientific qualifications but
“certified by the International Fact-Checking Network”—
whatever that is. It is all the more important to note, as the report does, that even Will Moy, the CEO of Full Fact, has said:
“There is a moral panic about ‘fake news’”,
leading to “frightening overreactions” by Governments and big tech.
I was also glad that the report noted the broader context of what I think is in danger of being a potential moral panic about online safety. Concerns from free-speechers are based on the offline problems of cancel culture and the ever-growing attacks on, for example, academic freedom in universities—such that the Government are attempting to legislate to enhance free expression on campus at the same time as undermining free expression online.
I will add another offline context: there is a contemporary therapeutic ethos that posits safety—especially psychological safety—as trumping freedoms of any sort. I hope that the committee will look at this at some stage. We cannot discuss online harms without understanding that the concept of harm is an ever-expanding category.
Before I look at that, I will make one clarification: whenever I raise problems with the Bill, the justifications that come back at me always centre on children’s safety. I note that I would be happy if the Online Safety Bill confined its focus to children and the young. Instead, the Government use adult worries about children’s access to porn, self-harm and suicide—all legitimate worries—to introduce huge legislative changes that will affect adult freedoms, effectively infantilising citizens and treating us as dependent children in need of protection from each other’s speech.
The report tells us:
“Civilised societies have legal safeguards to protect those who may be vulnerable.”
The problem is when vulnerability gets discussed in relation to adults. In a therapeutic culture, vulnerability and victimhood are valorised and often incentivised because, if we present ourselves as fragile and vulnerable, we have a cultural currency and power not only to gain attention and support but to silence others. For example, the report is extremely helpful in deconstructing the whole concept of harm: the committee rightly rails against the illiberal notion of censoring “legal but harmful” material, and hopefully the Government will indeed drop that egregious clause. The whole premise of the Bill is based on the idea that speech online can be, and often is, harmful. The elastic use of the term “harm” makes it ill-defined and subjective, conflating physical harm with psychological harm—and it is no wonder that many now see words as violence.
The committee helpfully asked the Government whether the
“Bill’s definition of psychological impact has any clinical basis”.
The reply came back saying, “No”; it would be up to “platforms … to make judgements” about speech causing anxiety or fear. This is potentially disastrous, as terms such as “offensive”, “hate” and “misinformation”—with all their subjectivity—can be invoked by individuals to claim that something should be banned.
The report notes that, a few years ago,
“the Christian Union at Balliol College … was banned from its freshers’ fair”,
on the basis that
“its presence could ‘harm’ some attendees.”
Goodness knows what they would make of the harm of having Bishops in this place. Only this week, Cambridge University faculty heads apologised to students for “distressing” them by sending an email promotion for a “potentially harmful” talk. What caused such alarm? A talk by Sex Matters’ Helen Joyce entitled “Criticising gender-identity ideology: what happens when speech is silenced”—oh the irony. Actually, much speech is silenced, online and offline, by deploying the language of psychology to suggest that speech, books and ideas are dangerous. Trigger warnings are put on lectures and literature to prevent post-traumatic stress disorder. PTSD is now invoked not after war or disaster, as it once was clinically, but for the potential harm caused by upsetting speech or words. So even if “harm” in the Bill were tied to clinical diagnosis, it would not help, because psychological language is now frequently used to silence us.