Online Safety Bill Debate
Baroness Kidron (Crossbench - Life peer)
Lords Chamber

I support Amendment 44. I am pleased that, as part of the new triple shield, the Government have introduced Clause 12 on “User empowerment duties”, which allow users to protect themselves, not just from abusive posts from other users but from whole areas of content. In the Communications and Digital Committee’s inquiry, we had plenty of evidence from organisations representing minorities and people with special characteristics who are unable adequately to protect themselves from the hate they receive online. I am glad that subsections (10) to (12) recognise specific content and users with special characteristics who are targets of abuse and need to be able to protect themselves, but subsection (3) requests that these features should be
“designed to effectively … reduce the likelihood of the user encountering content”
they want to avoid. I am concerned that “effectively” will be interpreted subjectively by platforms in scope and that each will interpret it differently.
At the moment, it will not be possible for Ofcom to assess how thoroughly the platforms have been providing these empowerment tools of protection for users. If the features are to work, there must be an overview of how effective they are being and how well they are working. When the former Secretary of State, Michelle Donelan, was asked about this, she said that there was nothing in this clause to pin an assessment on. It seems to me that the lists in Clause 12 create plenty of criteria on which to hang an assessment.
The new duties in Clause 12 provide for control tools for users against very specific content that is abusive or incites hatred on the basis of race, ethnicity, religion, disability, sex, gender reassignment or sexual orientation. However, this list is not exhaustive. There will inevitably be areas of content for which users have not been given blocking tools, including pornography, violent material and other material that is subject to control in the offline world.
Not only will the present list for such tools need to be assessed for its thoroughness in allowing users to protect themselves from specific harms, but surely the types of harm from which they need to protect themselves will change over time. Ofcom will need regularly to assess where these harms are and make sure that service providers regularly update their content-blocking tools. Without such an assessment, it will be hard for Ofcom and civil society to understand what the upcoming concerns are with the tools.
The amendment would provide a transparency obligation, which would demand that service providers inform users of the risks present on the platform. Surely this is crucial when users are deciding what to protect themselves from.
The assessment should also look for unintended restrictions on freedom of expression created by the new tools. If the tools are overprotective, they could surely create a bubble and limit users’ access to information that they might find useful. For example, the user might want to block material about eating disorders, but the algorithm might interpret that to mean limiting the user’s access to content on healthy lifestyles or nutrition. We are also told that the algorithms do not understand irony and humour. When the filters are used to stop content that is abusive or incites hatred on the basis of users’ particular characteristics, they might also remove artistic, humorous or satirical content.
Repeatedly, we are told that the internet creates echo chambers, where users read only like-minded opinions. These bubbles can create an atmosphere where freedom of expression is severely limited and democracy suffers. A freedom of expression element to the assessment would also, in these circumstances, be critical. We are told that the tech platforms often do not know what their algorithms do and, not surprisingly, they often evolve beyond their original intentions. Assessments on the tools demanded by Clause 12 need to be carefully investigated to ensure that they are keeping up to date with the trends of abuse on the internet but also for the unintended consequences they might create, curbing freedom of expression.
Throughout the Bill, there is a balancing act between freedom of expression and protection from abuse. The user empowerment tools are potentially very powerful, and neither the service providers, the regulators nor the Government know what their effects will be. It is incumbent on the Government to introduce an assessment to check regularly how the user empowerment duties are working; otherwise, how can they be updated, and how can Ofcom discover what content is being unintentionally controlled? I urge the Minister, in the name of common sense, to ensure that these powerful tools unleashed by the Bill will not be misused or become outdated in a fast-changing digital world.
My Lords, I thank the noble Lord, Lord Moylan, for his words—I thought I was experiencing time travel there—and am sympathetic to many of the issues that he has raised, although I think that some of the other amendments in the group tackle those issues in a slightly different way.
I support Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. Requiring a post-rollout assessment to ensure that the triple shield acts as we are told it will seems to be a classic part of any regulatory regime that is fit for purpose: it needs to assess whether the system is indeed working. The triple shield is an entirely new concept, and none of the burgeoning regulatory systems around the world is taking this approach, so I hope that both the Government and Ofcom welcome this very targeted and important addition to the Bill.
I will also say a few words about Amendments 154 and 218. It seems to me that, in moving away from legal but harmful—which as a member of the pre-legislative committee I supported, under certain conditionality that has not been met, but none the less I did support it—not enough time and thought have been given to the implications of that. I do not understand, and would be grateful to the Minister if he could help me understand, how Ofcom is to determine whether a company has met its own terms and conditions—and by any means, not only by the means of a risk assessment.
I want to make a point that the noble Baroness, Lady Healy, made the other day—but I want to make it again. Taking legal but harmful out and having no assessment of whether a company has met its general safety duties leaves the child safety duties as an island. They used to be something that was added on to a general system of safety; now they are the first and only port of call. Again, because of the way that legal but harmful fell out of the Bill, I am not sure whether we have totally understood how the child risk assessments sit without a generally cleaned up or risk-assessed digital environment.
Finally, I will speak in support of Amendment 160, which would have Ofcom say what “adequate and appropriate” terms are. To a large degree, that is my approach to the problem that the noble Lord, Lord Moylan, spoke about: let Parliament and the regulator determine what we want to see—as was said on the data protection system, that is how it is—and let us have minimum standards that we can rightly expect, based on UK law, as the noble Lord suggested.
I am not against the triple shield per se, but it radically replaced an entire regime of assessment, enforcement and review. I think that some of the provisions in this group really beg the Government’s attention, in order to make sure that there are no gaping holes in the regime.
My Lords, I will speak to Amendments 44 and 158 in the name of the right reverend Prelate the Bishop of Oxford. I also note my support for the amendments in the name of the noble Lord, Lord Stevenson of Balmacara, to ensure the minimum standard for a platform’s terms of service. My noble friend Lord Moylan has just given an excellent speech on the reasons why these amendments should be considered.
I am aware that the next group of amendments relates to the so-called user empowerment tools, so it seems slightly bizarre to be speaking to Amendment 44, which seeks to ensure that these user empowerment tools actually work as the Government hope they will, and Amendment 158, which seeks to risk assess whether providers’ terms of service duties do what they say and report this to Ofcom. Now that the Government have watered down the clauses that deal with protection for adults, like other noble Lords, I am not necessarily against the Government’s replacement—the triple shield—but I believe that it needs a little tightening up to ensure that it works properly. These amendments seem a reasonable way of doing just that. They would ensure greater protection for adults without impinging on others’ freedom of expression.
The triple shield relies heavily on companies’ enforcement of terms of service and on other vaguely worded duties; as the noble Viscount mentioned, user empowerment tools need only be “easily accessible” and “effective”, whatever that means. Unlike with other duties in the Bill, such as those on illegal content and children’s duties, there is no mechanism to assess whether these new measures are working; whether the way companies are carrying out these duties is in accordance with the criteria set out; and whether they are indeed infringing freedom of expression. Risk assessments are vital to doing just that, because they are vital to understanding the environment in which services operate. They can reduce bureaucracy by allowing companies to rule out risks which are not relevant to them, and they can increase user safety by revealing new risks, thereby enabling the future-proofing of a regime. Can the Minister give us an answer today as to why risk assessment duties on these two strands of the triple shield—terms of service and user empowerment tools—were removed? If freedom of speech played a part in this, perhaps he could elaborate why he thinks undertaking a risk assessment is in any way a threat.
Without these amendments, the Bill cannot be said to be a complete risk management regime. Companies will, in effect, be marking their own homework when designing their terms of service and putting their finger in the air when it comes to user empowerment tools. There will be no requirement for them to explain either to Ofcom or indeed to service users the true nature of the harms that occur on their service, nor the rationale behind any decisions they might make in these two fundamental parts of their service.
Since the Government are relying so heavily on their triple shield to ensure protection for adults, to me, not reviewing two of the three strands that make up the triple shield seems like fashioning a three-legged stool with completely uneven legs: a stool that will not stand up to the slightest pressure when used. Therefore, I urge the Minister to look again and consider reinstating these protections in the Bill.
My Lords, I contribute to this debate on the basis of my interests as laid out in the register: as chief executive of Cerebral Palsy Scotland; my work with the Scottish Government on people with neurological conditions; and as a trustee of the Neurological Alliance of Scotland. It is an honour to follow the right reverend Prelate, whose point about the inequality people experience in the online world is well made. I want to be clear that when I talk about ensuring online protection for people with disabilities, I do not assume that all adults with disabilities are unable to protect themselves. As the right reverend Prelate and the noble Lord, Lord Griffiths of Burry Port, pointed out, survey after survey demonstrates how offline vulnerabilities translate into the online world, and Ofcom’s own evidence suggests that people with physical disabilities, learning disabilities, autism, mental health issues and others can be classed as being especially vulnerable online.
The Government recognise that vulnerable groups are at greater risk online, because in its previous incarnations, this Bill included greater protection for such groups. We spoke in a previous debate about the removal of the “legal but harmful” provisions and the imposition of the triple shield. The question remains from that debate: does the triple shield provide sufficient protection for these vulnerable groups?
As I have said previously this afternoon, user empowerment tools are the third leg of the triple shield, but they put all the onus on users and no responsibility on the platforms to prevent individuals’ exposure to harm. Amendments 36, 37 and 38A, in the name of the noble Lord, Lord Clement-Jones, seek simply to make the default setting for the proposed user empowerment tools to be “on”. I do not pretend to understand how, technically, this will happen, but it clearly can, because the Bill requires platforms to ensure that this is the default position to ensure protection for children. The default position in those amendments protects all vulnerable people, and that is why I support them—unlike, I fear, Amendment 34 from my noble friend Lady Morgan, which lists specific categories of vulnerable adults. I would prefer that all vulnerable people be protected from being exposed to harm in the first place.
Nobody’s freedom of expression is affected in any way by this default setting, but the overall impact on vulnerable individuals in the online environment would, I assure your Lordships, be significant. Nobody’s ability to explore the internet or to go into those strange rooms at the back of bookshops that the noble Baroness, Lady Fox, was talking about would be curtailed. The Government have already stated that individuals will have the capacity to seek out these tools and turn them on and off, and that they must be easily accessible. So individuals with capacity will be able to find the settings and set them to explore whatever legal content they choose.
However, is it not our duty to remember those who do not have capacity? What about adults with learning difficulties and people at a point of crisis—the noble Baroness, Lady Parminter, movingly spoke about people with eating disorders—who might not be able to turn to those tools due to their affected mental state, or who may not realise that what they are seeing is intended to manipulate? Protecting those users from encountering such content in the first place surely tips the balance in favour of turning the tools on by default.
I am very sad that the noble Baroness, Lady Campbell of Surbiton, cannot be here, because her contribution to this debate would be powerful. But, from her enormous experience of work with disabled people, this is her top priority for the Bill.
In preparing to speak to these amendments, I looked back to the inquiry in the other place into online abuse and the experience of disabled people that was prompted by Katie Price’s petition after the shocking abuse directed at her disabled son Harvey. In April 2019 the Government responded to that inquiry by saying that they were
“aware of the disproportionate abuse experienced by disabled people online and the damage such abuse can have on people’s lives, career and health”—
and the Government pledged to act.
The internet is a really important place for disabled people, and I urge the Government to ensure that it remains a safe place for all of us and to accept these amendments that would ensure the default settings are set to on.
My Lords, I rise to support the amendments in the name of the noble Baroness, Lady Morgan. I do so somewhat reluctantly, not because I disagree with anything that she said but because I would not necessarily start from here. I want to briefly say three very quick things about that and then move on to Amendments 42 and 45, which are also in this group.
We already have default settings, and we are pretending that this is a zero-sum game. The default settings at the moment are profiling us, filtering us and rewarding us; and, as the right reverend Prelate said in his immensely powerful speech, we are not starting at zero. So I do share the concerns of the noble Baroness, Lady Fox, about who gets to choose—some of us on this side of the debate are saying, “Can we define who gets to choose? Can Parliament choose? Can Ofcom choose? Can we not leave this in the hands of tech companies?” So on that I fully agree. But we do have default settings already, and this is a question of looking at some of the features as well as the content. It is a weakness of the Government’s argument that it keeps coming back to the content rather than the features, which are the main driver of what we see.
The second thing I want to say—this is where I am anxious about the triple shield—is: does not knowing you are being abused mean that you are not abused? I say that as someone with some considerable personal abuse. I have my filter on and I am not on social media, but my children, my colleagues and some of the people I work with around the world do see what is said about me—it is a reputational thing, and for some of them it is a hurtful thing, and that is why I am reluctant in my support. However, I do agree with all the speakers who have said that our duty is to start with those people who are most vulnerable.
I want to mention the words of one of the 5Rights advisers—a 17-year-old girl—who, when invited to identify changes and redesign the internet, said, “Couldn’t we do all the kind things first and gradually get to the horrible ones?” I think that this could be a model for us in this Chamber. So I do support the noble Baroness.
I want to move briefly to Amendment 42, which would see an arbitrary list of protected characteristics replaced by the Equality Act 2010. This has a lot to do with a previous discussion we had about human rights, and I want to say urgently to the Minister that the offer of the Online Safety Bill is not to downgrade human rights, children’s rights and UK law, but rather to bring forward a smart and comprehensive regime to hold companies accountable for human rights, children’s rights and UK law. We do not want to have a little list of some of our children’s rights or of some of our legislation; we would like our legislation and our rights embedded in the Bill.
I also speak in favour of Amendment 45, and I express my gratitude to the noble Lord, Lord Stevenson, for tabling it. It would require Ofcom, six months after the event, to ask whether children need these user empowerment tools. It is hugely important. I remind the Committee that children have not only rights but an evolving capacity to be out there in the world. As I said earlier, the children’s safety duties have a cliff-edge feel to them. As children go out into the world on the cusp of adulthood, maybe they would like to have some of these user empowerment tools.
My Lords, the noble Baroness, Lady Kidron, said words to the effect that perhaps we should begin by having particular regard for certain vulnerabilities, but we are dealing with primary legislation and this really concerns me. Lists such as in Clause 12 are really dangerous. It is not a great way to write law. We could be with this law for a long time.
I took the Communications Act 2003 through for Her Majesty’s Opposition, and we were doing our absolute best to future-proof the legislation. There was no mention of the internet in that piece of legislation. With great respect to the noble Lord, Lord McNally, with whom I sparred in those days, it was not that Act that introduced Ofcom but a separate Act. The internet was not even mentioned until the late Earl of Northesk introduced an amendment with the word “internet” to talk about the investigative powers Act.
The reality is that we already had Facebook, and tremendous damage being done through it to people such as my daughter. Noble Lords will remember that in the early days it was Oxford, Cambridge, Yale and Harvard; that is how it all began. It was an amazing thing, and we could not foresee what would happen but there was a real attempt to future-proof. If you start having lists such as in Clause 12, you cannot just add on or change. Cultural mores change. This list, which looks great in 2023, might look really odd in about 2027. Different groups will have emerged and say, “Well, what about me, what about me?”.
I entirely agree with the noble Baroness, Lady Fox. Who will be the decider of what is right, what is rude or what is abusive? I have real concerns with this. The Government have had several years to get this right. I say that with great respect to my noble friend the Minister, but we will have to think about these issues a little further. The design of the technology around all this is what we should be imposing on the tech companies. I was on the Communications and Digital Committee in 2020 when that was a key plank of our report, following the inquiry that we carried out and prior to the Joint Committee, then looking at this issue of “legal but harmful”, et cetera. I am glad that was dropped because—I know that I should not say this—when I asked a civil servant what was meant by “harmful”, he said, “Well, it might upset people”.
It is a very subjective thing. This is difficult for the Government. We must do all we can to support the Government in trying to find the right solutions, but I am sorry to say that I am a lawyer—a barrister—and I worry. We are trying to make things right but, remember, once it is there in an Act, it is there. People will use that as a tool. In 2002, at New Scotland Yard, I was introduced to an incredible website about 65 ways to become a good paedophile. Where does that fit in Clause 12? I have not quite worked that out. Is it sex? What is it? We have to be really careful. I would prefer having no list and making it more general, relying on the system to allow us to opt in.
I support my noble friend Lady Morgan’s amendment on this, which would make it easier for people to say, “Well, that’s fine”, but would not exclude people. What happens if you do not fit within Clause 12? Do you then just have to suck it up? That is not a very House of Lords expression, but I am sure that noble Lords will relate to it.
We have to go with care. I will say a little more on the next group of amendments, on anonymity. It is really hard, but what the Government are proposing is not quite there yet.
That seemed to be provoked by me saying that we must look after the vulnerable, but I am suggesting that we use UK law and the rights that are already established. Is that not better than having a small list of individual items?
I agree. The small list of individual items is the danger.
Does the Minister therefore think that the Government condone the current system, where we are inundated algorithmically with material that we do not want? Are the Government condoning that behaviour, in the way that he is saying they would condone a safety measure?
We will come to talk about algorithms and their risks later on. There is an important balance to strike here that we have debated, rightly, in this group. I remind noble Lords that there are a range of measures that providers can put in place—
But as I think the noble Baroness understands from that reference, this is a definition already in statute, and with which Parliament and the courts are already engaged.
The Bill’s overarching freedom of expression duties also apply to Clause 12. Subsections (4) to (7) of Clause 18 stipulate that category 1 service providers are required to assess the impact on free expression from their safety policies, including the user empowerment features. This is in addition to the duties in Clause 18(2), which requires all user-to-user services to have particular regard to the importance of protecting freedom of expression when complying with their duties. The noble Baroness’s Amendment 283ZA would require category 1 providers to make judgments on user empowerment content to a similar standard required for illegal content. That would be disproportionate. Clause 170 already specifies how providers must make judgments about whether content is of a particular kind, and therefore in scope of the user empowerment duties. This includes making their judgment based on “all relevant information”. As such, the Bill already ensures that the user empowerment content features will be applied in a proportionate way that will not undermine free speech or hinder legitimate debate online.
Amendment 45, tabled by the noble Lord, Lord Stevenson of Balmacara, would require the Secretary of State to lay a Statement before Parliament outlining whether any of the user empowerment duties should be applied to children. I recognise the significant interest that noble Lords have in applying the Clause 12 duties to children. The Bill already places comprehensive requirements on Part 3 services which children are likely to access. This includes undertaking regular risk assessments of such services, protecting children from harmful content and activity, and putting in place age-appropriate protections. If there is a risk that children will encounter harm, such as self-harm content or through unknown or unverified users contacting them, service providers will need to put in place age-appropriate safety measures. Applying the user empowerment duties for child users runs counter to the Bill’s child safety objectives and may weaken the protections for children—for instance, by giving children an option to see content which is harmful to them or to engage with unknown, unverified users. While we recognise the concerns in this area, for the reasons I have set out, the Government do not agree with the need for this amendment.
I will resist the challenge of the noble Lord, Lord Knight, to talk about bots because I look forward to returning to that in discussing the amendments on future-proofing. With that, I invite noble Lords—
I noted the points made about the way information is pushed and, in particular, the speech of the right reverend Prelate. Nothing in the Government’s response has really dealt with that concern. Can the Minister say a few words about not the content but the way in which users are enveloped? On the idea that companies always act because they have a commercial imperative not to expose users to harmful material, actually, they have a commercial imperative to spread material and engage users. It is well recorded that a lot of that is in fact harmful material. Can the Minister speak a little more about the features rather than the content?
We will discuss this when it comes to the definition of content in the Bill, which covers features. I was struck by the speech by the right reverend Prelate about the difference between what people encounter online, and the analogy used by the noble Baroness, Lady Fox, about a bookshop. Social media is of a different scale and has different features which make that analogy not a clean or easy one. We will debate in other groups the accumulated threat of features such as algorithms, if the noble Baroness, Lady Kidron, will allow me to go into greater detail then, but I certainly take the points made by both the right reverend Prelate and the noble Baroness, Lady Fox, in their contributions.