Online Safety Bill Debate
Baroness Fox of Buckley (Non-affiliated - Life peer)
Debate with the Department for Digital, Culture, Media & Sport
(1 year, 6 months ago)
Lords Chamber

My Lords, I rise to speak to Amendment 141 in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. Once again, I register the support of my noble friend Lady Campbell of Surbiton, who feels very strongly about this issue.
Of course, there is value in transparency online, but anonymity can be vital for certain groups of people, such as those suffering domestic abuse, those seeking help or advice on matters they wish to remain confidential, or those who face significant levels of hatred or prejudice because of who they are, how they live or what they believe in. Striking the right balance is essential, but it is equally important that everyone who wishes to verify their identity and access the additional protections that this affords can do so easily and effectively, and that this opportunity is open to all.
Clause 57 requires providers of category 1 services to offer users the option to verify their identity, but it is up to providers to decide what form of verification to offer. Under subsection (2) it can be “of any kind”, and it need not require any documentation. Under subsection (3), the terms of service must include a “clear and accessible” explanation of how the process works and what form of verification is available. However, this phrase in itself is open to interpretation: clear and accessible for one group may be unclear and inaccessible to another. Charities including Mencap are concerned that groups, such as people with a learning disability, could be locked out of using these tools.
It is also relevant that people with a learning disability are less likely to own forms of photographic ID such as passports or driving licences. Should a platform require this type of ID, large numbers of people with a learning disability would be denied access. In addition, providing an email or phone number and verifying this through an authentication process could be extremely challenging for those people who do not have the support in place to help them navigate this process. This further disadvantages groups of people who already suffer some of the most extensive restrictions in living their everyday lives.
Clause 58 places a duty on Ofcom to provide guidance to help providers comply with their duty, but this guidance is optional. Amendment 141 aims to strengthen Clause 58 by requiring Ofcom to set baseline principles and standards for the guidance. It would ensure, for example, that the guidance considers accessibility for disabled as well as vulnerable adults and aligns with relevant guidance on related matters such as age verification; it would ensure that verification processes are effective; and it would ensure that the interests of disabled users are covered in Ofcom’s pre-guidance consultation.
The online world can be a lifeline for disabled and vulnerable adults, providing access to support, advice and communities of interest. This is particularly important as services in the real world are diminishing, so we need to ensure that user-verification processes do not act as a further barrier to inclusion for people with protected characteristics, especially those with learning disabilities.
My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.
An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitchhunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.
My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and the whole question of anonymity in general is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, and those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.
On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was
“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”
It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.
In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.
Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.
There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions you cannot give your real name, or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called “Meet the Secret Brexiteers”. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.
Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that in 2017-18, 96% of attempts by public authorities to identify the users behind anonymous social media accounts, email addresses and telephone numbers resulted in successful identification of the suspect in the investigation. In other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.
If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.
My Lords, what an unusually reticent group we have here for this group of amendments. I had never thought of the noble Baroness, Lady Fox, as being like Don Quixote, but she certainly seems to be tilting at windmills tonight.
I go back to the Joint Committee report, because what we said there is relevant. We said:
“Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response”.
We were very clear; the Government’s response on this was pretty clear too.
We said:
“The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms”.
We said there should be:
“A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category”.
Crucially for these amendments, we said:
“We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality”.
We were very clear about the difference between stripping away anonymity and ensuring that verification was available where the user wanted to engage only with those who had verified themselves. Requiring platforms to allow users—
My Lords, at the beginning of Committee, I promised that I would speak only twice, and this is the second time. I hope that noble Lords will forgive me if I stray from the group sometimes, but I will be as disciplined as I can. I will speak to Amendments 57 and 62, which the noble Baroness, Lady Featherstone, and I tabled. As others have said, the noble Baroness sends her apologies; sadly, she has fractured her spine, and I am sure we all wish her a speedy recovery. The noble Baroness, Lady Fox, has kindly added her name to these amendments.
As I have said, in a previous role, as a research director of a think tank—I refer noble Lords to my registered interests—I became interested in the phenomenon of unintended consequences. As an aside, it is sometimes known as the cobra effect, after an incident during the colonial rule of India, when a British administrator of Delhi devised a cunning plan to rid the city of dangerous snakes. It was simple: he would pay local residents a bounty for each cobra skin delivered. What could possibly go wrong? Never slow to exploit an opportunity, enterprising locals started to farm cobras as a way of earning extra cash. Eventually, the authorities grew wise to this, and the payments stopped. As a result, the locals realised that the snakes were now worthless and released them into the wild, leading to an increase, rather than a decrease, in the population of cobras.
As with the cobra effect, there have been many similar incidents of well-intentioned acts that have unintentionally made things worse. So, as we try to create a safer online space for our citizens, especially children and vulnerable adults, we should try to be as alert as we can to unintended consequences. An example is encrypted messages, which I discussed in a previous group. When we seek access to encrypted messages in the name of protecting children in this country, we should be aware that such technology could lead to dissidents living under totalitarian regimes in other countries being compromised or even murdered, with a devastating impact on their children.
We should also make sure that we do not unintentionally erode the fundamental rights and freedoms that underpin our democracy, and that so many people have struggled for over the centuries. I recognise that some noble Lords may say that that is applicable to other Bills, but I want to focus specifically on the implications for this Bill. In our haste to protect, we may create a digital environment and marketplace that stifles investment and freedom of expression, disproportionately impacting marginalised communities and cultivating an atmosphere of surveillance. The amendments the noble Baroness and I have tabled are designed to prevent such outcomes. They seek to strike a balance between regulating for a safer internet and preserving our democratic values. As many noble Lords have rightly said, all these issues will involve trade-offs; we may disagree, but I hope we will have had an informed debate, regardless of which side of the argument we are on.
We should explicitly outline the duties that service providers and regulators have with respect to these rights and freedoms. Amendment 57 focuses on safeguarding specific fundamental rights and freedoms for users of regulated user-to-user services, including the protection of our most basic human rights. We believe that, by explicitly stating these duties, rather than hoping that they are somehow implied, we will create a more comprehensive framework for service providers to follow, ensuring that their safety policies and procedures do not undermine the essential rights of users, with specific reference to
“users with protected characteristics under the Equality Act 2010”.
Amendment 62 focuses on the role of Ofcom in mitigating risks to freedom of expression. I recognise that there are other amendments in this group on that issue. It is our responsibility to ensure that the providers of regulated user-to-user services are held accountable for their content moderation and recommender systems, to ensure they do not violate our freedoms.
I want this Bill to be a workable Bill. As I have previously said, I support the intention behind it to protect children and vulnerable adults, but as I have said many times, we should also be open about the trade-off between security and protection on the one hand, and freedom of expression on the other. My fear is that, without these amendments, we risk leaving our citizens vulnerable to the unintended consequences of overzealous content moderation, biased algorithms and opaque decision-making processes. We should shine a light on and bring transparency to our new processes, and perhaps help guide them by being explicit about those elements of freedom of speech we wish to preserve.
It is our duty to ensure that the Online Safety Bill not only protects our citizens from harm but safeguards the principles that form the foundation of a free and open society. With these amendments, we hope to transcend partisan divides and to fortify the essence of our democracy. I hope that we can work together to create an online environment that is safe, inclusive and respectful of the rights and freedoms that the people of this country cherish. I hope that other noble Lords will support these amendments, and, ever the optimist, that my noble friend the Minister will consider adopting them.
My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall, who explained well why I put my name to the amendments. I extend my regards to the noble Baroness, Lady Featherstone; I was looking forward to hearing her remarks, and I hope that she is well.
I am interested in free speech; it is sort of my thing. I am interested in how we can achieve a balance and use the Bill to enhance the free speech rights of the citizens of this country—that is what I have tried to do with the amendments I have supported—because I fear those rights might otherwise be undermined by it.
I have a number of amendments in this group. Amendment 49 and the consequential Amendments 50 and 156 would require providers to include in their terms of service
“by what method content present on the service is to be identified as content of democratic importance”,
and bring Clause 13 in line with Clauses 14 and 15 by ensuring an enhanced focus on the democratic issue.
Amendment 53A would provide that notification is given
“to any user whose content has been removed or restricted”.
It is especially important that the nature of the restriction in place be made clear, evidenced and justified in the name of transparency and—a key point—that the user be informed of how to appeal such decisions.
Amendment 61 in my name calls for services to have
“proportionate systems, processes and policies designed to ensure that as great a weight is given to users’ right to freedom of expression ... as to safety when making decisions”
about whether to take down or restrict users’ access to the online world, and
“whether to take action against a user generating, uploading or sharing content”.
In other words, it is all about applying a more robust duty to category 1 service providers and emphasising the importance of protecting
“a wide diversity of political, social, religious and philosophical opinion”
online.
I give credit to the Government, in that Clause 18 constitutes an attempt by them in some way to balance the damage to individual rights to freedom of expression and privacy as a result of the Bill, but I worry that it is a weak duty. Unlike operational safety duties, which compel companies proactively to prevent or minimise so-called harm in the way we have discussed, there is no such attempt to insist that freedom of speech be given the same regard or importance. In fact, there are worries that the text of the Bill has downgraded speech and privacy rights, which the Open Rights Group says
“are considered little more than a contractual matter”.
There has certainly been a lot of mention of free speech in the debates we have had so far in Committee, yet I am not convinced that the Bill gives it enough credit, which is why I support the explicit reference to it by the noble Lord, Lord Kamall.
I have a lot of sympathy with the amendments of the noble Lord, Lord Stevenson, seeking to replace Clauses 13, 14, 15 and 18 with a single comprehensive duty, because in some ways we are scratching around. That made some sense to me and I would be very interested to hear more about how that might work. Clauses 13, 14, 15 and 18 state that service providers must have regard to the importance of protecting users’ rights to freedom of expression in relation to
“content of democratic importance ... publisher content ... journalistic content”.
The very existence of those clauses, and the fact that we even need those amendments, is an admission by the Government that elsewhere, free speech is a downgraded virtue. We need these carve-outs to protect these things, because the rest of the Bill threatens free speech, which has been my worry from the start.
My Amendment 49 is a response to the Bill’s focus on protecting “content of democratic importance”. I was delighted that this was included, and the noble Lord, Lord Stevenson of Balmacara, has raised a lot of the questions I was asking. I am concerned that it is rather vaguely drawn, and too narrow and technocratic—politics with a big “P”, rather than in the broader sense. There is a lot that I would consider democratically important that other people might see, especially given today’s discussion, as harmful or dangerous. Certainly, the definition should be as broad as possible, so my amendment seeks to write that down, saying that it should include
“political, social, religious and philosophical opinion”.
That is my attempt to broaden it out. It is not perfect, I am sure, but that is the intention.
I am also keen to understand why Clauses 14 and 15, which give special protection to news publisher and journalistic content, have enhanced provisions, including an expedited appeals process for the reinstatement of removed materials, but those duties are much weaker—they do not exist—in Clause 13, which deals with content of democratic importance. In my amendment, I have suggested that they are levelled up.