All 4 Debates between Lord Clement-Jones and Baroness Fox of Buckley

Thu 25th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 9th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1
Tue 9th May 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 2
Tue 25th Apr 2023
Online Safety Bill
Lords Chamber

Committee stage: Part 1

Online Safety Bill

Debate between Lord Clement-Jones and Baroness Fox of Buckley
Baroness Fox of Buckley (Non-Afl)

I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated, and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that appeals can be made for a small site to be treated differently? That is the way we deal with harmful material in general, and the way we have dealt with, for example, RT as press without compromising press freedom. That is the kind of point I am trying to make.

I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is my point.

Lord Clement-Jones (LD)

My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.

As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.

We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.

We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say “They would say that, wouldn’t they?” but they were pretty convincing. We heard from HOPE not Hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.

Of course, we know that these amendments today—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think strongly about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is to the fore—the more nuanced approach that we suggested—and the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in the Bill in dealing with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.

Online Safety Bill

Debate between Lord Clement-Jones and Baroness Fox of Buckley
Baroness Fox of Buckley (Non-Afl)

My Lords, this group of amendments looks at the treatment of legal content accessed by adults. The very fact that Parliament feels that legislation has a place in policing access to legal material is itself worrying. This door was opened by the Government in the initial draft Bill, but, as we have already heard, after a widespread civil liberties backlash against the legal but harmful clauses, we are left with Clause 65. As has been mentioned, I am worried that this clause, and some of the amendments, might well bring back legal but harmful for adults by the back door. One of the weasel words here is “harmful”. As I have indicated before, it is difficult to work out from the groupings when to raise which bit, so I am keeping that for your Lordships until later and will just note that I am rather nervous about the weasel word “harmful”.

Like many of us, I cheered at the removal of the legal but harmful provisions, but I have serious reservations about their replacement with further duties via terms of service, which imposes a duty on category 1 services to have systems and processes in place to take down or restrict access to content, and to ban or suspend users in accordance with terms of service, as the noble Lord, Lord Moylan, explained. It is one of the reasons I support his amendment. It seems to me to be the state outsourcing the grubby job of censorship to private multinational companies with little regard for UK law.

I put my name to Amendment 155 in the name of the noble Lord, Lord Moylan, because I wanted to probe the Government’s attitude to companies’ terms of service. Platforms have no obligation to align their terms of service with freedom of expression under UK law. It is up to them. I am not trying to impose on them what they do with their service users. If a particular platform wishes to say, “We don’t want these types of views on our platform”, fine, that is its choice. But when major platforms’ terms of service, which are extensive, become the basis on which UK law enforces speech, I get nervous. State regulators are to be given the role of ensuring that all types of lawful speech are suppressed online, because the duty applies to all terms of service, whatever they are, regarding the platforms’ policies on speech suppression, censorship, user suspension, bans and so on. This duty is not restricted to so-called harmful content; it is whatever content the platform wishes to censor.

What is more, Clause 65 asks Ofcom to ensure that individuals who express lawful speech are suspended or banned from platforms if in breach of the platforms’ Ts & Cs, and that means limiting those individuals from expressing themselves more widely, beyond the specific speech in question. That is a huge green light to interfere in UK citizens’ freedom of expression, in my opinion.

I stress that I am not interested in interfering in the terms and conditions of private companies, although your Lordships will see later that I have an amendment demanding that they introduce free-speech clauses. That is because of the way we seem to be enacting the law via the terms of service of private companies. They should of course be free to dictate their own terms of service, and it is reasonable that members of the public should know what they are and expect them to be upheld. But that does not justify the transformation of these private agreements into statutory duties—that is my concern.

So, why are we allowing this Bill to ask companies to enforce censorship policies in the virtual public square that do not exist in UK law? When companies’ terms of service permit the suppression of speech, that is up to them, but when they suppress speech far beyond the limitations of speech in UK law and are forced to do so by a government regulator such as Ofcom, are we not in trouble? It means that corporate terms of service, which are designed to protect platforms’ business interests, are trumping case law on free speech that has evolved over many years.

Those terms of service are also frequently in flux, according to fashion or ownership; one only has to look at the endless arguments, which I have yet to understand, about Twitter’s changing terms of service after the Elon Musk takeover. Is Ofcom’s job to follow Elon Musk’s ever-changing terms of service and enforce them on the British public as if they are law?

The terms and conditions are therefore no longer simply a contract between a company and the user; their being brought under statute means that big tech will be exercising public law functions, with Ofcom as the enforcer, ensuring that lawful speech is suppressed constantly, in line with private companies’ terms of service. This is an utter mess and not in any way adequate to protect free speech. It is a fudge by the Government: they were unpopular on “lawful but harmful”, so they have outsourced it to someone else to do the dirty work.

Lord Clement-Jones (LD)

My Lords, it has been interesting to hear so many noble Lords singing from the same hymn sheet—especially after this weekend. My noble friend Lord McNally opened this group by giving us his wise perspective on the regulation of new technology. Back in 2003, as he mentioned, the internet was not even mentioned in the Communications Act. He explained how regulation struggles to keep up and how quantum leaps come with a potential social cost; all that describes the importance of risk assessment of these novel technologies.

As we have heard from many noble Lords today, on Report in the Commons the Government decided to remove the adult safety duties—the so-called “legal but harmful” aspect of the Bill. I agree with the many noble Lords who have said that this has significantly weakened the protection for adults under the Bill, and I share the scepticism many expressed about the triple shield.

Right across the board, this group of amendments, with one or two exceptions, rightly aims to strengthen the terms of service and user empowerment duties in the Bill in order to provide a greater baseline of protection for adults, without impinging on others’ freedom of speech, and to reintroduce some risk-assessment requirement on companies. The new duties will clearly make the largest and riskiest companies expend more effort on enforcing their terms of service for UK users. However, the Government have not yet presented any modelling on what effect this will have on companies’ terms of service. I have some sympathy with what the noble Lord, Lord Moylan, said: the new duties could mean that terms of service become much longer and lawyered. This might have an adverse effect on freedom of expression, leading to the use of excessive takedown measures rather than looking at other more systemic interventions to control content such as service design. We heard much the same argument from the noble Baroness, Lady Fox. They both made a very good case for some of the amendments I will be speaking to this afternoon.

On the other hand, companies that choose to do nothing will have an easier life under this regime. Faced with stringent application of the duties, companies might make their terms of service shorter, cutting out harms that are hard to deal with because of the risk of being hit with enforcement measures if they do not. Therefore, far from strengthening protections via this component of the triple shield, the Bill risks weakening them, with particular risks for vulnerable adults. As a result, I strongly support Amendments 33B and 43ZA, which my noble friend Lord McNally spoke to last week at the beginning of the debate on this group.

Like the noble Baroness, Lady Kidron, I strongly support Amendments 154, 218 and 160, tabled by the noble Lord, Lord Stevenson, which would require regulated services to maintain “adequate and appropriate” terms of service, including provisions covering the matters listed in Clause 12. Amendment 44, tabled by the right reverend Prelate the Bishop of Oxford and me, inserts a requirement that services to which the user empowerment duties apply

“must make a suitable and sufficient assessment of the extent to which they have carried out the duties in this section including in each assessment material changes from the previous assessment such as new or removed user empowerment features”.

The noble Viscount, Lord Colville, spoke very well to that amendment, as did the noble Baronesses, Lady Fraser and Lady Kidron.

Amendment 158, also tabled by me and the right reverend Prelate, inserts a requirement that services

“must carry out a suitable and sufficient assessment of the extent to which they have carried out the duties under sections 64 and 65 ensuring that assessment reflects any material changes to terms of service”.

That is a very good way of meeting some of the objections that we have heard to Clause 65 today.

These two amendments focus on risk assessment because the new duties do not have an assessment regime to work out whether they work, unlike the illegal content and children’s duties, as we have heard. Risk assessments are vital to understanding the environment in which the services are operating. A risk assessment can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and it can increase user safety by revealing new risks and future-proofing a regime.

The Government have not yet provided, in the Commons or in meetings with Ministers, any proper explanation of why risk assessment duties have been removed along with the previous adult safety duties, and they have not explained in detail why undertaking a risk assessment is in any way a threat to free speech. They are currently expecting adults to manage their own risks, without giving them the information they need to do so. Depriving users of basic information about the nature of harms on a service prevents them taking informed decisions as to whether they want to be on it at all.

Without these amendments, the Bill cannot be said to be a complete risk management regime. There will be no requirement to explain to Ofcom or to users of a company’s service the true nature of the harms that occur on its service, nor the rationale behind the decisions made in these two fundamental parts of the service. This is a real weakness in the Bill, and I very much hope that the Minister will listen to the arguments being made this afternoon.

Online Safety Bill

Debate between Lord Clement-Jones and Baroness Fox of Buckley
Baroness Fox of Buckley (Non-Afl)

My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.

An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitchhunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.

My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and to the whole question of anonymity in general, is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, with those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.

On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was

“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”

It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.

In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.

Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.

There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions you cannot give your real name or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called “Meet the Secret Brexiteers”. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.

Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that in 2017-18, 96% of attempts by public authorities to identify anonymous users of social media accounts, their email addresses and telephone numbers resulted in successful identification of the suspect in the investigation. In other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.

If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.

Lord Clement-Jones (LD)

My Lords, what an unusually reticent group we have here for this group of amendments. I had never thought of the noble Baroness, Lady Fox, as being like Don Quixote, but she certainly seems to be tilting at windmills tonight.

I go back to the Joint Committee report, because what we said there is relevant. We said:

“Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response”.


We were very clear; the Government’s response on this was pretty clear too.

We said:

“The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms”.


We said there should be:

“A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category”.


Crucially for these amendments, we said:

“We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality”.


We were very clear about the difference between stripping away anonymity and ensuring that verification was available where the user wanted to engage only with those who had verified themselves. Requiring platforms to allow users—

Online Safety Bill

Debate between Lord Clement-Jones and Baroness Fox of Buckley
Baroness Fox of Buckley (Non-Afl)

My Lords, to go back not just to the age question, the noble Lord, Lord Allan of Hallam, reminded us that community-led moderation is not just Wikipedia. What I tried to hint at earlier is that that is one of the most interesting, democratic aspects of the online world, which we should protect.

We often boast that we are a self-regulating House and that that makes us somehow superior to the place up the road—we are all so mature because we self-regulate; people do behave badly, but we decide. It is a lesson in democracy that you have a self-regulating House, and there are parts of the online world that self-regulate. Unless we think that the citizens of the UK are less civilised than Members of the House of Lords, which I would dispute, we should say that it is positive that there are self-moderating, self-regulating online sites. If you can say something, and people can object and have a discussion about it, and things can be taken down, to me that is the way we should deal with speech that is inappropriate or wrong. The bulk of these amendments—I cannot remember how many there are now—are right.

I was glad that the noble Lord, Lord Moylan, said he could not understand why this grouping had happened, which is what I said earlier. I had gone through a number of groupings thinking: “What is that doing there? Am I missing something? Why is that in that place?” I think we will come back to the age verification debate and discussion.

One thing to note is that one of the reasons organisations such as Wikipedia would be concerned about age verification—and they are—is anonymity. It is something we have to consider. What is going to happen to anonymity? It is so important for journalists, civil liberties activists and whistleblowers. Many Wikipedia editors are anonymised, maybe because they are editing sites on politically controversial issues. Imagine being a Wikipedia editor from Russia at the moment—you would not want to have to say who you are. We will come back to it, but it is important to understand that those behind Amendment 26, and those who say that we should look at the question of age verification, are not doing so because they do not care about children or are not interested in protecting them. However, the dilemmas that any age-gating or age verification poses for adult civil liberties have to be considered. We have to worry that, because of an emphasis on checking age, some websites will decide to sanitise what they allow to be published to make it suitable for children, just in case they come across it. Again, that will have a detrimental impact on adult access to all knowledge.

These will be controversial issues, and we will come back to them, but it is good to have started the discussion.

Lord Clement-Jones (LD)

My Lords, this has been a very strange debate. It has been the tail end of the last session and a trailer for a much bigger debate coming down the track. It was very odd.

We do not want to see everything behind an age-gating barrier, so I agree with my noble friend. However, as the noble Baroness, Lady Kidron, reminded us, it is all about the risk profile, and that then leads to the kind of risk assessment that a platform is going to be required to carry out. There is a logic to the way that the Bill is going to operate.

When you look at Clause 11(3), you see that it is not disproportionate. It deals with “primary priority content”. This is not specified in the Bill but it is self-harm and pornography—major content that needs age-gating. Of course we need to have the principles for age assurance inserted into the Bill as well, and of course it will be subject to debate as we go forward.

There is technology to carry out age verification which is far more sophisticated than it ever was, so I very much look forward to that debate. We started that process in Part 3 of the Digital Economy Act. I was described as an internet villain for believing in age verification. I have not changed my view, but the debate will be very interesting. As regards the tail-end of the previous debate, of course we are sympathetic on these Benches to the Wikipedia case. As we said on the last group, I very much hope that we will find a way, whether it is in Schedule 1 or in another way, of making sure that Wikipedia is not affected overly by this—maybe the risk profile that is drawn up by Ofcom will make sure that Wikipedia is not unduly impacted.