My Lords, I will speak to Amendment 155 in my name, and I am grateful for the support of the noble Baroness, Lady Fox of Buckley, and my noble friend Lord Strathcarron. Some of my remarks in Committee last week did not go down terribly well with Members and, in retrospect, I realise that that was because I was the only Member of the Committee that day who did not take the opportunity to congratulate the noble Baroness, Lady Kidron, on her birthday. So at this very late stage—a week later—I make good that deficiency and hope that, in doing so, I will get a more jocular and welcoming hearing than I did last week. I will speak in a similar vein, though on a different topic and part of the Bill.
This amendment relates to Clause 65, which has 12 subsections. I regard the first subsection as relatively uncontroversial; it imposes a duty on all service providers. The effect of this amendment would be to remove all the remaining subsections, which fall particularly on category 1 providers. What Clause 65 does, in brief, is to make it a statutory obligation for category 1 providers to live up to their terms of service. Although it does not seek to specify what the terms of service must be, it does, in some ways, specify how they should be operated once they have been written—I regard that as very odd, and will come back to the reason why.
I say at the outset that I understand the motivation behind this section of the Bill. It addresses the understandable feeling that if a service provider of any sort says that they have terms of service which mean that, should there be complaints, they will be dealt with in a certain way and to a certain timetable and that you will get a response by a certain time, or if they say that they will remove certain material, then they should do what they say they will do in the terms of service. I understand what the clause is trying to do—to oblige service providers to live up to their terms of service—but this is a very dangerous approach.
First of all, while terms of service are a civil contract between the provider and the user, they are not an equal contract, as we all know. They are written for the commercial benefit and advantage of the companies that write them—not just in the internet world; this is generally true—and they are written on a take it or leave it basis. Of course, they cannot be egregiously disadvantageous to the customer or else the customer would not sign up to them; none the less, they are drafted with the commercial and legal advantage of the companies in question. Terms of service can be extreme. Noble Lords may be aware that, if you have a bank account, the terms of service that your bank has, in effect, imposed on you almost certainly include a right for the bank to close your account at any time it wishes and to give no reason for doing so. I regard that as an extreme terms of service provision, but it is common. They are not written as equal contracts between consumers and service providers.
Why, therefore, would we want to set terms of service in statute? That is what this clause does: it makes them enforceable by a regulator under statute. Moreover, why would we want to do it when the providers we are discussing will, in practice, almost certainly have drafted their terms of service under the provisions of a foreign legal system, which we are then asking our regulator to enforce? My objection is not to trying to find a way of requiring providers to live up to the terms of service they publish—indeed, the normal process for doing so would be through a civil claim; instead, I object to the method of doing so set out in this section of the Bill.
We do not use this method with other terms of service features. For example, we do not have a regulator who enforces terms of service on data protection; we have a law that says what companies must do to protect data, and then we expect them to draft terms of service, and to conduct themselves in other ways, that are compatible with that law. We do not make the terms of service themselves enforceable through statute and regulation, yet that is what this Bill does.
When we look at the terms of service of the big providers on the internet—the sorts of people we have in mind for the scope of the Bill—we find that they give themselves, in their terms of service, vast powers to remove a wide range of material. Much of that would fall—I say this without wanting to be controversial—into the category of “legal but harmful”, which in some ways this clause is reviving through the back door.
Of course, what could be “harmful” is extremely wide, because it will have no statutory bounds: it will be whatever Twitter or Google say they will remove in their terms of service. We have no control over what they say in their terms of service; we do not purport to seek such control in the Bill or in this clause. Twitter policy, for example, is to take down material that offends protected characteristics such as “gender” and “gender identity”. Now, those are not protected characteristics in the UK; the relevant protected characteristics in the Equality Act are “sex” and “gender reassignment”. So this is not enforcing our law; our regulator will be enforcing a foreign law, even though it is not the law we have chosen to adopt here.
My Lords, my noble friend has explained clearly how terms of service would normally work, which is that, as I said myself, a business might write its own terms of service to its own advantage but it cannot do so too egregiously or it will lose customers, and businesses may aim themselves at different customers. All this is part of normal commercial life, and that is understood. What my noble friend has not really addressed is the question of why uniquely and specifically in this case, especially given the egregious history of censorship by Silicon Valley, he has chosen to put that into statute rather than leave it as a commercial arrangement, and to make it enforceable by Ofcom. For example, when my right honourable friend David Davis was removed from YouTube for his remarks about Covid passes, it would have been Ofcom’s obligation not to vindicate his right to free speech but to cheer on YouTube and say how well it had done for its terms of service.
Our right honourable friend’s content was reuploaded. This makes the point that the problem at the moment is the opacity of these terms and conditions; what platforms say they do does not always align with what they actually do. The Bill makes sure that users can hold them to account for the terms of service that they publish, so that people can know what to expect on platforms and have some form of redress when their experience does not match their expectations.
I was coming on to say a bit more about that after making some points about foreign jurisdictions and my noble friend’s Amendment 155. As I say, parts or versions of the service that are used in foreign jurisdictions but not in the UK are not covered by the duties in Clause 65. As such, the Bill does not require a provider to have systems and processes designed to enforce any terms of service not applicable in the UK.
In addition, the duties do not give powers to Ofcom to enforce a provider’s terms of service directly. Ofcom’s role will be focused on ensuring that platforms have systems and processes in place to enforce their own terms of service consistently rather than assessing individual pieces of content.
Requiring providers to set terms of service for specific types of content suggests that the Government view that type of content as harmful or risky. That would encourage providers to prohibit such content, which of course would have a negative impact on freedom of expression, which I am sure is not what my noble friend wants to see. Freedom of expression is essential to a democratic society. Throughout the passage of the Bill, the Government have always committed to ensuring that people can speak freely online. We are not in the business of indirectly telling companies what legal content they can and cannot allow online. Instead, the approach that we have taken will ensure that platforms are transparent and accountable to their users about what they will and will not allow on their services.
Clause 65 recognises that companies, as private entities, have the right to remove content that is legal from their services if they choose to do so. To prevent them doing so, by requiring them to balance this against other priorities, would have perverse consequences for their freedom of action and expression. It is right that people should know what to expect on platforms and that they are able to hold platforms to account when that does not happen. On that basis, I invite the noble Lords who have amendments in this group not to press them.
My Lords, I am going to endeavour to be relatively brief. I rise to move Amendment 38 and to speak to Amendments 39, 139 and 140 in this group, which are in my name. All are supported by my noble friend Lord Vaizey of Didcot, to whom I am grateful.
Amendments 38 and 39 relate to Clause 12. They remove subsections (6) and (7) from the Bill; that is, the duty to filter out non-verified users. Noble Lords will understand that this is different from the debate we have just had, which was about content. This is about users and verification of the users, rather than the harm or otherwise of the content. I am sure I did not need to say that, but perhaps it helps to clarify my own thinking to do so. Amendments 139 and 140 are essentially consequential but make it clear that my amendments do not prohibit category 1 services from offering this facility. They make it a choice, not a duty.
I want to make one point only in relation to these amendments. It has been well said elsewhere that this is a Twitter-shaped Bill, but it is trying to apply itself to a much broader part of the internet than Twitter, or things like it. In particular, community-led services like Wikipedia, to which I have made reference before, operate on a totally different basis. The Bill seeks to create a facility whereby members of the public like you and me can, first, say that we want the provider to offer a facility for verifying those who might use their service, and secondly, for us, as members of the public, to be able to say we want to see material from only those verified accounts. However, the contributors to Wikipedia are not verified, because Wikipedia has no system to verify them, and therefore it would be impossible for Wikipedia, as a category 1 service, to be able to comply with this condition on its current model, which is a non-commercial, non-profit one, as noble Lords know from previous comments. It would not be able to operate this clause; it would have to say that either it is going to require every contributing editor to Wikipedia to be verified first in order to do so, which would be extremely onerous; or it would have to make it optional, which would be difficult, but lead to the bizarre conclusion that you could open an article on Wikipedia and find that some of its words or sentences were blocked, and you could not read them because those amendments to the article had been made by someone who had not been verified. Of course, putting a system in place to allow that absurd outcome would itself be an impossible burden on Wikipedia.
My complaint—as always, in a sense—about the Bill is that it misfires. Every time you touch it, it misfires in some way because it has not been properly thought through. It is perhaps trying to do too much across too broad a front, when it is clear that the concern of the Committee is much narrower than trying to bowdlerize Wikipedia articles. That is not the objective of anybody here, but it is what the Bill is tending to do.
I will conclude by saying—I invite my noble friend to comment on this if he wishes; I think he will have to comment on it at some stage—that in reply to an earlier Committee debate, I heard him say somewhat tentatively that he did not think that Wikipedia would qualify as a category 1 service. I am not an advocate for Wikipedia; I am just a user. But we need to know what the Government’s view is on the question of Wikipedia and services like it. Wikipedia is the only community-led service, I think, of such a scale that it would potentially qualify as category 1 because of its size and reach.
If the Minister’s view is that Wikipedia would not qualify as a category 1 service—in which case, my amendments are irrelevant because it would not be caught by this clause—then he needs to say so. More than that, he needs to say on what basis it would not qualify as a category 1 service. Would it be on the face of the Bill? If not, would it be in the directions given by the Secretary of State to the regulator? Would it be a question of the regulator deciding whether it was a category 1 service? Obviously, if you are trying to run an operation such as Wikipedia with a future, you need to know which of those things it is. Do you have legal security against being determined as a category 1 provider or is it merely at the whim—that is not the right word; the decision—of the regulator in circumstances that may legitimately change? The regulator may have a good or bad reason for changing that determination later. You cannot run a business not knowing these things.
I put it to noble Lords that this clause needs very careful thinking through. If it is to apply to community-led services such as Wikipedia, it is an absurdity. If it is not to apply to them because what I think I heard my noble friend say pertains and they are not, in his view, a category 1 service, why are they not a category 1 service? What security do they have in knowing either way? I beg to move.
My Lords, I will speak to Amendment 106 in my name and the names of my noble and learned friend Lord Garnier and the noble Lord, Lord Moore of Etchingham. This is one of five amendments focused on the need to address the issue of activist-motivated online bullying and harassment and thereby better safeguard the mental health and general well-being of potential victims.
Schedule 4, which defines Ofcom’s objectives in setting out codes of practice for regulated user-to-user services, should be extended to require the regulator to consider the protection of individuals from communications offences committed by anonymous users. The Government clearly recognise that there is a threat of abuse from anonymous accounts and have taken steps in the Bill to address that, but we are concerned that their approach is insufficient and may be counterproductive.
I will explain. The Government’s approach is to require large social media platforms to make provision for users to have their identity verified, and to have the option of turning off the ability to see content shared by accounts whose owners have not done this. However, all this would mean is that people could not see abuse being levelled at them. It would not stop the abuse happening. Crucially, it would not stop other people seeing it, or the damage to his or her reputation or business that the victim may suffer as a result. If I am a victim of online bullying and harassment, I do not want to see it, but I do not want it to be happening at all. The only means I have of stopping it is to report it to the platform and then hope that it takes the right action. Worse still, if I have turned off the ability to see content posted by unverified—that is, anonymous—accounts, I will not be able to complain to the platform as I will not have seen it. It is only when my business goes bust or I am shunned in the street that I realise that something is wrong.
The approach of the Bill seems to be that, for the innocent victim—who may, for example, breed livestock for consumption—it is up to that breeder to be proactive to correct harm already done by someone who does not approve of eating meat. This makes a nonsense of the law. This is not how we make laws in this country—until now, it seems. Practically speaking, the worst that is likely to happen is that the platform might ban the abuser’s account. However, if their victims have had no opportunity to read the abuse or report it, even that fairly low-impact sanction could not be levelled against them. In short, the Bill’s current approach, I am sorry to say, would increase the sense of impunity, not lessen it.
One could argue that, if a potential abuser believes that their victim will not read their abuse, they will not bother issuing it. Unfortunately, this misunderstands the psyche of the online troll. Many of them are content to howl into the void, satisfied that other people who have not turned on the option to filter out content from unverified accounts will still be able to read it. The troll’s objective of harming the victim may be partially fulfilled as a result.
There is also the question of how much uptake there will be of the option to verify one’s identity, and numerous questions about the factors that this will depend on. Will it be attractive? Will there be a cost? How quick and efficient will the process be? Will platforms have the capacity to implement it at scale? Will it have to be done separately for every platform?
If uptake of verification is low, most people simply will not use the option to filter out content from unverified accounts, even if it means that they remain more susceptible to abuse, since they would be cutting themselves off from most other users. Clearly, that is not an option for anyone using social media for any promotional purpose. Even those who use it for purely social reasons will find that they have friends who do not want to be verified. Fundamentally, people use social media because other people use it. Carving oneself off from most of them defeats the purpose of the exercise.
It is not clear what specific measures the Bill could take to address the issue. Conceivably, it could simply ban online platforms from maintaining user accounts whose owners have not had their identities verified. However, this would be truly draconian and most likely lead to major platforms exiting the UK market, as the noble Baroness, Lady Fox, has rightly argued in respect of other possible measures. It would also be unenforceable, since users could simply turn on a VPN, pretend to be from some other country where the rules do not apply and register an account as though they were in that country.
There are numerous underlying issues that the Bill recognises as problems but does not attempt to prescribe solutions for. Its general approach is to delegate responsibility to Ofcom to frame its codes of practice for operators to follow in order to effectively tackle these problems. Specifically, it sets out a list of objectives that Ofcom, in drawing up its codes of practice, will be expected to meet. The protection of users from abuse, specifically by unverified or anonymous users, would seem to be an ideal candidate for inclusion in this list of objectives. If required to do so, Ofcom could study the issue closely and develop more effective solutions over time.
I was pleased to see, in last week’s Telegraph, an article that gave an all too common example: a chef running a pub in Cornwall has suffered what amounts to vicious abuse online from a vegan who obviously does not approve of the menu, and who is damaging the business’s reputation and putting the chef’s livelihood at risk. This is just one tiny example, if I can put it that way, of the many thousands that are happening all the time. Some 584 readers left comments, and just about everyone wrote in support of the need to do something to support that chef and tackle this vicious abuse.
I return to a point I made in a previous debate: livelihoods, which we are deeply concerned about, are at stake here. I am talking not about big business but about individuals and small and family businesses that are suffering—beyond abuse—loss of livelihood, financial harm and/or reputational damage to business, and the knock-on effects of that.
My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults and safeguarding privacy and anonymity.
Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.
In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.
Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.
While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.
To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.
Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.
That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.
Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do. If this is a statement of intent, I have let that one go, in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective and I am pleased to add my name to it.
Amendment 41 seeks to make it so that users can see whether another user is verified or not. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified or not may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.
Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.
Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.
Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.
The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.
A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.
Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.
Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.
Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.
The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.
Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.
My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?
I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.
We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.
Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.
The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.
I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.
My Lords, I hung back in the hope that the noble and learned Lord, Lord Hope of Craighead, would speak before me, because I suspected that his remarks would help elucidate my amendments, as I believe they have. I have a large number of amendments in this group, but all of them, with one exception, work together as, effectively, a single amendment. They are Amendments 101, 102, 109, 112, 116, 121, 191 and 220. The exception is Amendment 294, to which the noble Baroness, Lady Fox of Buckley, alluded and to which I shall return in a moment.
Taking that larger group of amendments first, I can describe their effect relatively briefly. In the Bill, there are requirements on services to consider how their practices affect freedom of expression, but there is no equivalent explicit duty on the regulator, Ofcom, to have regard to freedom of expression.
These amendments, taken together, would require Ofcom to
“have special regard to freedom of expression”
within the law when designing codes of practice, writing guidance and undertaking enforcement action. They would insert a new clause requiring Ofcom to have special regard to rights to freedom of expression within the law in preparing a code of practice; they would also require Ofcom, when submitting a draft code to the Secretary of State, to submit a statement setting out that it had complied with the duty imposed by that new requirement; and they would require the Secretary of State to submit that statement to Parliament when laying a draft code before Parliament. They would impose similar obligations on Ofcom and the Secretary of State in relation to any amendments to codes that might be made later. Finally, they would have a similar effect relating to guidance issued by Ofcom.
It is so glaringly obvious that Ofcom should be under this duty that it must be a mere omission that the balancing, corresponding duty placed on the providers has not also been placed on the regulator. I would hope, though experience so far in Committee does not lead me to expect it, that my noble friend would accept this, and that it would pass relatively uncontroversially.
My Lords, I was not going to speak on this group, but I was provoked into offering some reflections on the speech by the noble Lord, Lord Russell of Liverpool, especially his opening remarks about cars and planes, which he said were designed to be safe. He did not mention trains, about which I know something as well, and which are also designed to be safe. These are a few initial reflective points: planes and trains are designed in very different ways. An aeroplane is designed never to fail; a train is designed so that if it fails, it will come to a stop. They are two totally different approaches to safety. Simply saying that something must be designed to be safe does not answer questions; it opens questions about what we actually mean by that. The noble Lord went on to say that we do not allow children to drive cars and fly planes. That is absolutely true, but the thrust of his amendment is that we should design the internet so that it can be driven by children and used by children—so that it is designed for them, not for adults. That is my problem with the general thrust of many of these amendments.
A further reflection that came to mind as the noble Lord spoke was on a book of great interest that I recommend to noble Lords. It is a book by the name of Risk, written in 1995 by Professor John Adams, then professor of geography at University College London. He is still an emeritus professor of geography there. It was a most interesting work on risk. First, it reflected on how little we actually know about many of the things whose risks we are trying to assess.
More importantly, he went on to say that people have an appetite for risk. That appetite for risk—that risk budget, so to speak—changes over the course of one’s life: one has much less appetite for risk when one gets to a certain age than perhaps one had when one was young. I have never bungee jumped in my life, and I think I can assure noble Lords that the time has come when I can say I never shall, but there might have been a time when I was younger when I might have flung myself off a cliff, attached to a rubber band and so forth—noble Lords may have done so. One has an appetite for risk.
The interesting thing that he went on to develop from that was the notion of risk compensation: that if you have an appetite for risk and your opportunities to take risks are taken away, all you do is compensate by taking risks elsewhere. So a country such as New Zealand, which has some of the strictest cycling safety laws, also has a very high incidence of bungee jumping among the young; as they cannot take risks on their bicycles, they will find ways to go and do it elsewhere.
Although these reflections are not directly germane to the amendments, they are important as we try to understand what we are seeking to achieve here, which is a sort of hermetically sealed absence of risk for children. I do not think it will work. I said at Second Reading that I thought the flavour of the debate was somewhat similar to a late medieval conclave of clerics trying to work out how to mitigate the harmful effects of the invention of movable type. That did not work either, and I think we are in a very similar position today as we discuss this.
There is also the question of harm and what it means. While the examples being given by noble Lords are very specific and no doubt genuinely harmful, and are the sorts of things that we should like to stop, the drafting of the amendments, using very vague words such as “harm”, is dangerous overreach in the Bill. To give just one example, for the sake of speed, when I was young, administering the cane periodically was thought good for a child in certain circumstances. The mantra was, “Spare the rod and spoil the child”, though I never heard it said. Nowadays, we would not think it morally or psychologically good to do physical harm to a child. We would regard it as an unmitigated harm and, although not necessarily banned or illegal, it is something that—
My Lords, I respond to the noble Lord in two ways. First, I ask him to reflect on how the parents of the children who have died through what the parents would undoubtedly view as serious and unbearable harm would feel about his philosophical ruminations. Secondly, as somebody who has the privilege of being a Deputy Speaker in your Lordships’ House, it is incumbent and germane for us all to focus on the amendment in question and stay on it, to save time and get through the business.
Well, I must regard myself as doubly rebuked, and unfairly, because my reflections are very relevant to the amendments, and I have developed them in that direction. In respect of the parents, they have suffered very cruelly and wrongly, but although it may sound harsh, as I have said in this House before on other matters, hard cases make bad law. We are in the business of trying to make good law that applies to the whole population, so I do not think that these are wholly—
If my noble friend could, would he roll back the health and safety regulations for selling toys, in the same way that he seems so happy to have no health and safety regulations for children’s access to digital toys?
My Lords, if the internet were a toy, aimed at children and used only by children, those remarks would of course be very relevant, but we are dealing with something of huge value and importance to adults as well. It is the lack of consideration of the role of adults, the access for adults and the effects on freedom of expression and freedom of speech, implicit in these amendments, that cause me so much concern.
I seem to have upset everybody. I will now take issue with and upset the noble Baroness, Lady Benjamin, with whom I have not engaged on this topic so far. At Second Reading and earlier in Committee, she used the phrase, “childhood lasts a lifetime”. There are many people for whom this is a very chilling phrase. We have an amendment in this group—a probing amendment, granted—tabled by the noble Lord, Lord Knight of Weymouth, which seeks to block access to VPNs as well. We are in danger of putting ourselves in the same position as China, with a hermetically sealed national internet, attempting to put borders around it so that nobody can breach it. I am assured that even in China this does not work and that clever and savvy people simply get around the barriers that the state has erected for them.
Before I sit down, I will redeem myself a little, if I can, by giving some encouragement to the noble Baroness, Lady Kidron, on Amendments 28 and 32—although I think the amendments are in the name of the noble Lord, Lord Russell of Liverpool. These amendments, if we are to assess the danger posed by the internet to children, seek to substitute an assessment of the riskiness of the provider for the Government’s emphasis on the size of the provider. As I said earlier in Committee, I do not regard size as being a source of danger. When it comes to many other services—I mentioned that I buy my sandwich from Marks & Spencer as opposed to a corner shop—it is very often the bigger provider that I feel is going to be safer, because I feel I can rely on its processes more. So I would certainly like to hear how my noble friend the Minister responds on that point in relation to Amendments 28 and 32, and why the Government continue to put such emphasis on size.
More broadly, in these understandable attempts to protect children, we are in danger of using language that is far too loose and of having an effect on adult access to the internet which is not being considered in the debate—or at least has not been until I have, however unwelcomely, raised it.
My Lords, I assure your Lordships that I rise to speak very briefly. I begin by reassuring my noble friend Lord Moylan that he is loved in this Chamber and outside. I was going to say that he is the grit in the oyster that ensures that a consensus does not establish itself and that we think hard about these amendments, but I will revise that and say he is now the bungee jumper in our ravine. I think he often makes excellent and worthwhile points about the scope and reach of the Bill and the unintended consequences. Indeed, we debated those when we debated the amendments relating to Wikipedia, for example.
Obviously, I support these amendments in principle. The other reason I wanted to speak was to wish the noble Baroness, Lady Kidron—Beeban—a happy birthday, because I know that these speeches will be recorded on parchment bound in vellum and presented to her, but also to thank her for all the work that she has done for many years now on the protection of children’s rights on the internet. It occurred to me, as my noble friend Lady Harding was speaking, that there were a number of points I wanted to seek clarity on, either from the Minister or from the proponents of the amendments.
First, the noble Baroness, Lady Harding, mentioned the age-appropriate design code, which was a victory for the noble Baroness, Lady Kidron. It has, I think, already had an impact on the way that some sites that are frequented by children are designed. I know, for instance, that TikTok—the noble Baroness will correct me—prides itself on having made some changes as a result of the design code; for example, its algorithms are able, to a certain extent, to detect whether a child is under 13. I know anecdotally that children under 13 sometimes do have their accounts taken away; I think that is a direct result of the amendments made by the age-appropriate design code.
I would like to understand how these amendments, and the issue of children’s rights in this Bill, will interact with the age-appropriate design code, because none of us wants the confetti of regulations that either overlap or, worse, contradict themselves.
Secondly, I support the principle of functionality. I think it is a very important point that these amendments make: the Bill should not be focused solely on content but should take into account that functionality leads to dangerous content. That is an important principle on which platforms should be held to account.
Thirdly, going back to the point about the age-appropriate design code, the design of websites is extremely important and should be part of the regulatory system. Those are the points I wanted to make.
My Lords, this is a very large and wide-ranging group of amendments. Within it, I have a number of amendments that, on their own, span three separate subjects. I propose to address these one after the other in my opening remarks, but other subjects will be brought in as the debate continues and other noble Lords speak to their own amendments.
If I split the amendments that I am speaking to into three groups, the first is Amendments 17 and 18. These relate to Clause 9, on page 7, where safety duties about illegal content are set out. The first of those amendments addresses the obligation to prevent individuals encountering priority illegal content by means of the service.
Earlier this week in Committee, I asked the Minister whether the Government understood “prevent” and “protect”, both of which they use in the legislation, to have different weight. I did not expect my noble friend to give an answer at that point, but I know that he will have reflected on it. We need clarity about this at some point, because courts will be looking at, listening to and reading what the Government say at the Dispatch Box about the weight to be given to these words. To my mind, to prevent something happening requires active measures in advance that ensure as far as reasonably and humanly possible that it does not actually happen, but one could be talking about something more reactive to protect someone from something happening.
This distinction is of great importance to internet companies—I am not talking about the big platforms—which will be placed, as I say repeatedly, under very heavy burdens by the Bill. It is possible that they simply will not be able to discharge them and will have to go out of business.
Let us take Wikipedia, which was mentioned earlier in Committee. It operates in 300 languages but employs 700 moderators globally to check what is happening. If it is required by Clause 9 to
“prevent individuals from encountering priority illegal content by means of the service”,
it will have to scrutinise what is put up on this community-driven website as or before it appears. Quite clearly, something such as Welsh Wikipedia—there is Wikipedia in Welsh—simply would not get off the ground if it had to meet that standard, because the number of people who would have to be employed to do that would be far more than the service could sustain. However, if we had something closer to the wording I suggest in my amendment, where services have to take steps to “protect” people—so they could react to something and take it down when they become aware of it—it all becomes a great deal more tolerable.
Similarly, Amendment 18 addresses subsection (3) of the same clause, where there is a
“duty to operate a service using proportionate systems and processes … to … minimise the length of time”
for which content is present. How do you know whether you are minimising the length of time? How is that to be judged? What is the standard by which that is to be measured? Would it not be a great deal better and more achievable if the wording I propose, which is that you simply are under an obligation to take it down, were inserted? That is my first group of amendments. I put that to my noble friend and say that all these amendments are probing to some extent at this stage. I would like to hear how he thinks that this can actually be operated.
My second group is quite small, because it contains only Amendment 135. Here I am grateful to the charity JUSTICE for its help in drawing attention to this issue. This amendment deals with Schedule 7, on page 202, where the priority offences are set out. Paragraph 4 of the schedule says that a priority offence includes:
“An offence under any of the following provisions of the Public Order Act 1986”.
One of those is Section 5 of that Act, “Harassment, alarm or distress”. Here I make a very different point and return to territory I have been familiar with in the past. We debated this only yesterday in Grand Committee, although I personally was unable to be there: the whole territory of hate crimes, harmful and upsetting words, and how they are to be judged and dealt with. In this case, my amendment would remove Section 5 of the Public Order Act from the list of priority offences.
If society has enough problems tolerating the police going round and telling us when we have done or said harmful and hurtful things and upbraiding us for it, is it really possible to consider—without the widest form of censorship—that it is appropriate for internet platforms to judge us, shut us down and shut down our communications on the basis of their judgment of what we should be allowed to say? We already know that there is widespread suspicion that some internet platforms are too quick to close down, for example, gender critical speech. We seem to be giving them something close to a legislative mandate to be very trigger-happy when it comes to closing down speech by saying that it engages, or could engage, Section 5 of the Public Order Act. I will come to the question of how they judge it in my third group, in a moment—but the noble Lord might be able to help me.
Just to reinforce the point the noble Lord, Lord Moylan, made on that, I certainly had experience of where the police became the complainants. They would request, for example, that you take down an English Defence League event, claiming that it would be likely to cause a public order problem. I have no sympathy whatever with the English Defence League, but I am very concerned about the police saying “You must remove a political demonstration” to a platform and citing the legal grounds for doing that. The noble Lord is on to a very valid point to be concerned about that.
I am grateful to the noble Lord. I really wonder whether the Government realise what they are walking into here. On the one hand, yesterday the Grand Committee was debating the statutory instrument putting in place new statutory guidance for the police on how to enforce, much more sensitively than in the past, non-crime hate incidents. However, on the other hand, the next day in this Chamber we are putting an obligation on a set of mostly foreign private companies to act as a police force to go around bullying us and closing us down if we say something that engages Section 5 of the Public Order Act. I think this is something the Government are going to regret, and I would very much like to hear what my noble friend has to say about that.
Finally, I come to my third group of amendments: Amendments 274, 278, 279 and 283. They are all related and on one topic. These relate to the text of the Bill on page 145, in Clause 170. Here we are discussing what judgments providers have to make when they come to decide what material to take down. Inevitably, they will have to make judgments. That is one of the unfortunate things about this Bill. A great deal of what we do in our lives is going to have to be based on judgments made by private companies, many of which are based abroad but which we are trying to legislate for.
It makes a certain sense that the law should say what they should take account of in making those judgments. But the guidance—or rather, the mandate—given to those companies by Clause 170 is, again, very hair-trigger. Clause 170(5), which I am proposing we amend, states:
“In making such judgements, the approach to be followed is whether a provider has reasonable grounds to infer that content is … of the kind in question”.
I am suggesting that “reasonable grounds to infer” should be replaced with “sufficient evidence to infer”, so that they have to be able to produce some evidence that they are justified in taking content down. The test should be higher than simply having “reasonable grounds”, which may rest on a suspicion and little evidence at all. So one of those amendments relates to strengthening that bar so that they must have real evidence before they can take censorship action.
I add only two words to subsection (6), which talks about reasonable grounds for the inference—it defines what the reasonable grounds are—that
“exist in relation to content and an offence if, following the approach in subsection (2)”
and so on. I am saying “if and only if”—in other words, I make it clear that this is the only basis on which material can be censored using the provisions in this section, so as to limit it from going more widely. The third amendment in my group is essentially consequential to that.
I am struggling a little to understand why the Minister thinks that sufficient evidence is subjective, and therefore, I assume, reasonable grounds to infer is objective. Certainly, in my lexicon, evidence is more objective than inference, which is more subjective. I was reacting to that word. I am not sure that he has fully made the case as to why his wording is better.
I take the noble Lord’s point and my noble friend’s further contribution. I will see whether I can give a clearer and more succinct description in writing to flesh that out, but that is the reason that we have alighted on the words that we have.
The noble Lord, Lord Allan, also asked about jurisdiction. If an offence has been committed in the UK and viewed by a UK user, it can be treated as illegal content. That is set out in Clause 53(11), which says:
“For the purposes of determining whether content amounts to an offence, no account is to be taken of whether or not anything done in relation to the content takes place in any part of the United Kingdom”.
I hope that that bit, at least, is clearly set out to the noble Lord’s satisfaction. It looks like it may not be.
If it has been committed in the UK and is viewed by a UK user, it can be treated as illegal. I will follow up on the noble Lord’s further points ahead of the next stage.
Amendment 272 explicitly provides that relevant information that is reasonably available to a provider includes information submitted by users in complaints. Providers will already need to do this when making judgments about content, as it will be both relevant and reasonably available.
My noble friend Lord Moylan returned to the question that arose on day 2 in Committee, querying the distinction between “protect” and “prevent”, and suggesting that a duty to protect would or could lead to the excessive removal of content. To be clear, the duty requires platforms to put in place proportionate systems and processes designed to prevent users encountering content. I draw my noble friend’s attention to the focus on systems and processes in that. This requires platforms to design their services to achieve the outcome of preventing users encountering such content. That could include upstream design measures, as well as content identification measures, once content appears on a service. By contrast, a duty to protect is a less stringent duty and would undermine the proactive nature of the illegal content duties for priority offences.
Before he moves on, is my noble friend going to give any advice to, for example, Welsh Wikipedia, as to how it will be able to continue, or are the concerns about smaller sites simply being brushed aside, as my noble friend explicates what the Bill already says?
I will deal with all the points raised in the speech. If I have not done so by the end, and if my noble friend wants to intervene again, I would be more than happy to hear further questions, either to answer now or to write to him about.
Amendments 128 to 133 and 143 to 153, in the names of the right reverend Prelate the Bishop of Derby and the noble Lord, Lord Stevenson of Balmacara, seek to ensure that priority offences relating to modern slavery and human trafficking, where they victimise children, are included in Schedule 6. These amendments also seek to require technology companies to report content which relates to modern slavery and the trafficking of children—including the criminal exploitation of children—irrespective of whether it is sexual exploitation or not. As noble Lords know, the strongest provisions in the Bill relate to children’s safety, and particularly to child sexual exploitation and abuse content. These offences are captured in Schedule 6. The Bill includes a power for Ofcom to issue notices to companies requiring them to use accredited technology or to develop new technology to identify, remove and prevent users encountering such illegal content, whether communicated publicly or privately.
These amendments would give Ofcom the ability to issue such notices for modern slavery content which affects children, even when there is no child sexual exploitation or abuse involved. That would not be appropriate for a number of reasons. The power to tackle illegal content on private communications has been restricted to the identification of content relating to child sexual exploitation and abuse because of the particular risk to children posed by content which is communicated privately. Private spaces online are commonly used by networks of criminals to share illegal images—as we have heard—videos, and tips on committing these abhorrent offences. This is highly unlikely to be reported by other offenders, so it will go undetected if companies do not put in place measures to identify it. Earlier in Committee, the noble Lord, Lord Allan, suggested that those who receive it should report it, but of course, in a criminal context, a criminal recipient would not do that.
Extending this power to cover the identification of modern slavery in content which is communicated privately would be challenging to justify and could represent a disproportionate intrusion into someone’s privacy. Furthermore, modern slavery is usually identified through patterns of behaviour or by individual reporting, rather than through content alone. This reduces the impact that any proactive technology required under this power would have in tackling such content. Schedule 6 already sets out a comprehensive list of offences relating to child sexual exploitation and abuse which companies must tackle. If these offences are linked to modern slavery—for example, if a child victim of these offences has been trafficked—companies must take action. This includes reporting content which amounts to an offence under Schedule 6 to the National Crime Agency or another reporting body outside of the UK.
My noble friend Lord Moylan’s Amendment 135 seeks to remove the offence in Section 5 of the Public Order Act 1986 from the list of priority offences. His amendment would mean that platforms were not required to take proactive measures to reduce the risk of content which is threatening or abusive, and intended to cause a user harassment, alarm or distress, appearing on their service. Instead, they would be obliged to respond only once they are made aware of the content, which would significantly reduce the impact of the Bill’s framework for tackling such threatening and abusive content. Given the severity of the harm which can be caused by that sort of content, it is right that companies tackle it. Ofcom will have to include the Public Order Act in its guidance about illegal content, as provided for in Clause 171.
Government Amendments 136A to 136C seek to strengthen the illegal content duties by adding further priority offences to Schedule 7. Amendments 136A and 136B will add human trafficking and illegal entry offences to the list of priority offences in the Bill. Crucially, this will mean that platforms will need to take proactive action against content which encourages or assists others to make dangerous, illegal crossings of the English Channel, as well as those who use social media to arrange or facilitate the travel of another person with a view to their exploitation.
The noble Lord, Lord Allan, asked whether these amendments would affect the victims of trafficking themselves. This is not about going after the victims. Amendment 136B addresses only content which seeks to help or encourage the commission of an existing immigration offence; it will have no impact on humanitarian communications. Indeed, to flesh out a bit more detail, Section 2 of the Modern Slavery Act makes it an offence to arrange or facilitate the travel of another person—including by recruiting them—with a view to their exploitation. This offence largely appears online in the form of advertisements to recruit people into being exploited. Some of the steps that platforms could put in place include setting up trusted flagger programmes, signposting users to support and advice, and blocking known bad actors. Again, I point to some of the work which is already being done by social media companies to help tackle both illegal channel crossings and human trafficking.
My Lords, it is genuinely difficult to summarise such a wide-ranging debate, which was of a very high standard. Only one genuinely bright idea has emerged from the whole thing: as we go through Committee, each group of amendments should be introduced by the noble Lord, Lord Allan of Hallam, because it is only after I have heard his contribution on each occasion that I have begun to understand the full complexity of what I have been saying. I suspect I am not alone in that and that we could all benefit from hearing the noble Lord before getting to our feet. That is not meant to sound the slightest bit arch; it is absolutely genuine.
The debate expressed a very wide range of concerns. Concerns about gang grooming and recruiting were expressed on behalf of the right reverend Prelate the Bishop of Derby, and my noble friend Lady Buscombe expressed concerns about the trolling of country businesses. However, I think it is fair to say that most speakers focused on the following issues. The first was the definition of legality, which was so well explicated by the noble Lord, Lord Allan of Hallam. The second was the judgment bar that providers have to pass to establish whether something should be taken down. The third was the legislative mandating of private foreign companies to censor free speech rights that are so hard-won here in this country. These are the things that mainly concern us.
I was delighted that I found myself agreeing so much with what the noble Baroness, Lady Kidron, said, even though she was speaking in another voice or on behalf of another person. If her own sentiments coincide with the sentiments of the noble Viscount—
I am sorry to intrude, but I must say now, on the record, that I was speaking on my own behalf. The complications of measuring those particular things are terribly important to establish, so I am once again happy to agree with the noble Lord.
I am delighted to hear the noble Baroness say that; it shows that the pool of common ground we share is widening every time we get to our feet. However, the pool is not widening particularly—at least in respect of myself; other noble Lords may have been greatly reassured—as regards my noble friend the Minister, who I am afraid has not in any sense addressed the issues about free speech that I and many other noble Lords raised. On some issues, we in the Committee are finding a consensus that is drifting away from the Minister. We probably need to put our heads together more closely on some of these issues as Committee proceeds.
My noble friend also did not say anything that satisfied me in respect of the practical operation of these obligations for smaller sites. He speaks smoothly and persuasively of risk-based proactive approaches without acknowledging that, for a large number of sites, this legislation will mean a complete re-engineering of their business model. Take Wikipedia operating in a minority language: Welsh Wikipedia, for example, is the largest Welsh-language website in the world. If its model is to monitor what is put up by the community and correct it as it goes along, rather than to have a model designed in advance to prevent things being put there in the first place, then it is very likely to close down. If that is one of the consequences of this Bill, the Government will soon hear about it.
Finally, although I remain concerned about public order offences, I have to say to the Minister that if he is so concerned about the dissemination of alarm among the population under the provisions of the Public Order Act, what does he think that His Majesty’s Government were doing on Sunday at 3 pm? I beg leave to withdraw the amendment.
(1 year, 7 months ago)
Lords Chamber
My Lords, in moving my Amendment 13 I will speak to all the amendments in the group, all of which are in my name with the exception of Amendment 157 in the name of my noble friend Lord Pickles. These are interlinked amendments; they work together, and in effect there is only one amendment at issue. A noble Lord challenged me a day or two ago as to whether I could summarise in a sentence what the amendment does, and the answer is that I think I can: Clause 23 imposes various duties on search engines, and this amendment would remove one of those duties from search engines that fall into category 2B.
There are two categories of search engine, 2A and 2B; category 2B comprises the smaller search engines. We do not know the difference between them in greater detail than that, because the schedule that relates to them reserves to the Secretary of State the power to set the thresholds that will define which category a search engine falls into, but I think it is clear that category 2B covers the smaller ones.
These amendments pursue a theme that I brought up in Committee earlier in the week when I argued that the Bill would put excessively onerous and unnecessary obligations on smaller businesses. The particular duty that these amendments would take away from smaller search engines is referred to in Clause 23(2):
“A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service”.
The purpose of that is to recognise that, according to the Government’s own assessment of the market, very large numbers of smaller businesses do not pose a risk, and to allow them to get on with their business without taking these onerous and difficult measures. These are probing amendments, intended to find out what the Government are willing to do for smaller businesses to make this a workable Bill.
I can already imagine that there are noble Lords in the Chamber who will say that small does not equal safe, and that small businesses need to be covered by the same rigorous regulations as larger businesses. But I am not saying that small equals safe. I am saying—as I attempted to say when the Committee met earlier—that absolute safety is not attainable. It is not attainable in the real world, nor can we expect it to be attainable in the online world. I imagine that objection will be made. I see that it has some force, but I do not think it has sufficiently compelling force to justify the sort of burden that this Bill would put on small businesses, and I would like to hear more about it.
I will say one other thing. Those who object to this approach need to be sure in their own minds that they are not contributing to creating a piece of legislation that, when it comes into operation, is so difficult to implement that it becomes discredited. There needs to be a recognition that this has to work in practice. If it does not—if it creates resentment and opposition—we will find the Government not bringing sections of it into force, needing to repeal them or going easy on them once the blowback starts, so to speak. With that, I beg to move.
My Lords, I will speak to Amendment 157 in the name of the noble Lord, Lord Pickles, and others, since the noble Lord is unavoidably absent. It is along the same lines as Amendment 13; it is relatively minor and straightforward, and asks the Government to recognise that search services such as Google are enormously important as an entry to the internet. They are different from social media companies such as Twitter. We ask that the Government be consistent in applying their stated terms when these are breached in respect of harm to users, whether that be through algorithms, through auto-prompts or otherwise.
As noble Lords will be aware, the Bill treats user-to-user services, such as Meta, and search services, such as Google, differently. The so-called third shield or toggle proposed for shielding users from legal but harmful content, should they wish to be shielded, does not apply when it comes to search services, important though they are. Indeed, at present, large, traditional search services, including Google and Microsoft Bing, and voice search assistants, including Alexa and Siri, will be exempted from several of the requirements for large user-to-user services—category 1 companies. Why the discrepancy? Though search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars—the systems they design and employ—are their responsibility, and these have been proven to do harm.
Some of the examples of such harm have already been cited in the other place, but not before this Committee. I do not want to give them too much of an airing, because they are in the past and the search companies have taken them down after complaints, but some of the dreadful things that have emerged from searching on Google et cetera are a warning of what could occur. It has been pointed out that search engines would in the past have thrown up, for example, swastikas, SS bolts and other Nazi memorabilia when people searched for desk ornaments. If George Soros’s name came up, he would be included in a list of people responsible for world evils. The Bing service, which I dislike anyway, has been directing people—at least, it did in the past—to anti-Semitic and homophobic searches through its auto-complete, while Google’s image carousel highlighted pictures of portable barbecues to those searching for the term “Jewish baby stroller”.
My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.
I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.
I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.
I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.
On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm, which is priority illegal content or other illegal content, that the provider knows about, having been alerted to it by another person or become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.
There should be occasions when we can say that knowing that harmful material will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require that assessment to be renewed annually, or after a certain period, to make sure that things had not changed. So even escaping the burden is itself quite a burden for small businesses, while implementing it is so onerous that it could be ruinous, whereas taking material down when it appears is much easier to do.
Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.
So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.
My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.
While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.
I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.
I am glad I gave the noble Baroness the opportunity for that intervention. I have a reasonable level of technical knowledge—I hand-coded my first website in 1999, so I go back some way—but, given the structures we are dealing with, I question whether it is possible to create these tools and then guarantee that they will be used only in a certain way. If you break the door open, anyone can walk through it—that is the situation we are in.
As the noble Lord, Lord Allan, said, this is a crucial part of the Bill that was not properly examined and worked through in the other place. I will conclude by saying that it is vital we have a full and proper debate in this area. I hope the Minister can reassure us that he and the department will continue to be in dialogue with noble Lords as the Bill goes forward.
My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect it not to be overheard, although I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call—somebody duly authorised by reference to a tribunal, having taken all the lawful steps necessary, because a competent authority has been persuaded that the police service, or whoever, has a reason to listen to my call: to avoid public harm or to meet some other justified objective agreed through legislation.
Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to it. However, the regulator could at any moment step in and demand records from the past—records up to that point—without, as far as I can see, the intervention of a tribunal, any reference to a warrant, or any obligation to explain to anybody its basis for doing so. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike with the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and show Ofcom what Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.
That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and that resistance is entirely understandable: to do it in this space is completely at odds with the way in which we have felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change, and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.
My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.
We should remember that encrypted chat has become an indispensable part of the way we live, in this country and around the world. According to the Open Rights Group, it has replaced the old-fashioned wired telephone—a rather quaint phrase. Chat services matter so much to the citizens of the United Kingdom that they are used by 60% of the total population; that alone should make us think about what we are doing to these services.
End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because the messages are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock, and only you and the recipient hold the special key needed to unlock and read them.
Obviously, there are certain problems. Certain Government Ministers voluntarily shared all their WhatsApp messages with a journalist, who then shared them with the rest of us. If your Lordships were in that group, you might have thought that a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though, as a grown-up, you need to remember that somebody might leak them. But the main point is that these are a secure form of conversation, and one that is widely used.
Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.
We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.
The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would enable the surveillance of encrypted communications—for child exploitation and terrorism content, for example. Advocates and people on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, but actually it would be an attack on encryption itself.
Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.
I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s OK as long as we can deal with that”. Scanning is put forward as a solution to the problem of encrypted chat services being used to send messages of that nature, and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important backdrop to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that, through this Bill, we are promising a silver bullet: that it will all be solved by some of these measures.
(1 year, 7 months ago)
Lords Chamber
My Lords, before I speak to my Amendment 9, which I will be able to do fairly briefly because a great deal of the material on which my case rests has already been given to the Committee by the noble Baroness, Lady Fox of Buckley, I will make the more general and reflective point that there are two different views in the Committee that somehow need to be reconciled over the next few weeks. There is a group of noble Lords who are understandably and passionately concerned about child safety. In fact, we all share that concern. There are others of us who believe that this Bill, its approach and the measures being inserted into it will have massive ramifications outside the field of child safety, for adults, of course, but also for businesses, as the noble Baroness explained. The noble Baroness and I, and others like us, believe that these are not sufficiently taken into account either by the Bill or by those pressing for measures to be harsher and more restrictive.
Some sort of balance needs to be found. At Second Reading, my noble friend the Minister said that the balance had been struck in the right place. It is quite clear that nobody really agrees with that, except on the principle—which I think is always a cop-out, and which I have never logically understood—that if everyone disagrees with you, you must be right. I hope my noble friend will not resort to claiming that he has got it right simply because everyone disagrees with him in different ways.
My amendment is motivated by the considerations set out by the noble Baroness, which I therefore do not need to repeat. It is the Government’s own assessment that between 20,000 and 25,000 businesses will be affected by the measures in this Bill. A great number of those—some four-fifths—are small businesses or micro-businesses. The Government appear to think, in their assessment, that only 120 of those are high risk. The reason they are deemed high risk is not that they are engaged in unpleasant activities but simply that they are engaged in livestreaming and contacting new people. That might be for nefarious purposes but equally it might not, so the number among those 120 that we actually need to worry about could be very small indeed. We handle this already through our own laws; all these businesses would still be subject to existing data protection law and to the general law on what they are allowed to publish and broadcast. It would not be a free-for-all or a wild west, even among that very small number of businesses.
My Amendment 9 takes a slightly different approach to dealing with this. I do not in any way disagree with or denigrate the approach taken by the noble Baroness, Lady Fox, but my approach would be to add two categories to the list of exemptions in the schedules. The first is services provided by small and medium-sized enterprises. We do not have to define those, because there is already a law that defines them for us: Section 33 of the Small Business, Enterprise and Employment Act 2015. My proposal is that we take that definition, and that businesses falling within it be outside the scope of the Bill.
The second area that I would propose exempting was also referred to by the noble Baroness, Lady Fox of Buckley: community-based services. The largest of these, and the one that frequently annoys us because it gets things wrong, is Wikipedia. I am a great user of Wikipedia but I acknowledge that it does make errors. Of course, most of the errors it makes, such as saying, “Lord Moylan has a wart on the end of his nose”, would not be covered by the Bill anyway. Nothing in the Bill will force people to correct factual statements that have been got wrong—my year of birth or country of birth, or whatever. That is not covered. Those are the things they usually get wrong and that normally annoy us when we see them.
However, I do think that these services are extremely valuable. Wikipedia is an immense achievement and a tremendous source of knowledge and information for people. The fact that it has been put together in this organic, community-led way over a number of years, in so many languages, is a tremendous advantage and a great human advance. Yet, under the proposed changes, Wikipedia would not be able to operate its existing model of people posting their comments.
Currently, you go on Wikipedia and you can edit it. Now, I know this would not apply to any noble Lords but, in the other place, it has been suggested that MPs have discovered how to do this. They illicitly and secretly go on to their own pages and edit them, usually in a flattering way, so it is possible to do this. There is no prior restraint, and no checking in advance. There are moderators at Wikipedia—I do not know whether they are employed—who review what has been done over a period, but they do not do what this Bill requires, which is checking in advance.
It is not simply about Wikipedia; there are other community sites. Is it sensible that Facebook should be responsible if a little old lady alters the information on a community Facebook page about what is happening in the local parish? Why should Facebook be held responsible for that? Why would we want it to be responsible for it—and how could it do it without effectively censoring ordinary activities that people want to carry out, using the advantages of the internet that have been so very great?
What I am asking for is not dramatic. We have many laws in which we very sensibly create exemptions for small and medium-sized enterprises. I am simply asking that this law be considered under that heading as well, and similarly for Wikipedia and community-based sites. That second category is slightly unusual—it is not a normal exemption—but it is very relevant to this Bill, and I very much hope the Government will agree to it.
The answer that I would not find satisfactory—I say this in advance for the benefit of my noble friend the Minister, in relation to this and a number of other amendments I shall be moving in Committee—is that it will all be dealt with by Ofcom. That would not be good enough. We are the legislators and we want to know how these issues will be dealt with, so that the legitimate objectives of the Bill can be achieved without causing massive disruption, cost and disbenefit to adults.
My Lords, I rise to speak in support of Amendment 9, tabled by the noble Lord, Lord Moylan, and in particular the proposed new paragraph 10A to Schedule 1. I hope I will find myself more in tune with the mood of the Committee on this amendment than on previous ones. I would be interested to know whether any noble Lords believe that Ofcom should be spending its limited resources supervising a site like Wikipedia under the new regime, as it seems to me patently obvious that that is not what we intend; it is not the purpose of the legislation.
The noble Lord, Lord Moylan, is right to remind us that one of the joys of the internet is that you buy an internet connection, plug it in and there is a vast array of free-to-use services which are a community benefit, produced by the community for the community, with no harm within them. What we do not want to do is interfere with or somehow disrupt that ecosystem. The noble Baroness, Lady Fox, is right to remind us that there is a genuine risk of people withdrawing from the UK market. We should not sidestep that. People who try to be law-abiding will look at these requirements and ask themselves, “Can I meet them?” If the Wikimedia Foundation that runs Wikipedia does not think it can offer its service in a lawful way, it will have to withdraw from the UK market. That would be to the detriment of children in the United Kingdom, and certainly not to their benefit.
There are principle-based and practical reasons why we do not want Ofcom to be operating in this space. The principle-based one is that it makes me uncomfortable that a Government would effectively tell their regulator how to manage neutral information sites such as Wikipedia. There are Governments around the world who seek to do that; we do not want to be one of those.
The amendment attempts to define this public interest, neutral, informational service. It happens to be user-to-user but it is not like Facebook, Instagram or anything similar. I would feel much more comfortable making it clear in law that we are not asking Ofcom to interfere with those kinds of services. The practical reason is the limited time Ofcom will have available. We do not want it to be spending time on things that are not important.
Definitions are another example of how, with the internet, it can often be extremely hard to draw bright lines. Functionalities bleed into each other. That is not necessarily a problem, until you try to write something into law; then, you find that your definition unintentionally captures a service that you did not intend to capture, or unintentionally misses out a service that you did intend to be in scope. I am sure the Minister will reject the amendment because that is what Ministers do; but I hope that, if he is not willing to accept it, he will at least look at whether there is scope within the Bill to make it clear that Wikipedia is intended to be outside it.
Paragraph 4 of Schedule 1 refers to “limited functionality services”. That is a rich vein to mine. It is clear that the intention is to exclude mainstream media, for example. It refers to “provider content”. In this context, Encyclopaedia Britannica is not in scope but Wikipedia is, the difference being that Wikipedia is constructed by users, while Encyclopaedia Britannica is regarded as being constructed by a provider. The Daily Mail is outside scope; indeed, all mainstream media are outside scope. Anyone who declares themselves to be media—we will debate this later on—is likely to be outside scope.
Such provider exemption should be offered to other, similar services, even if they happen to be constructed from the good will of users as opposed to a single professional author. I hope the Minister will be able to indicate that the political intent is not that we should ask Ofcom to spend time and energy regulating Wikipedia-like services. If so, can he point to where in the legislation we might get that helpful interpretation, in order to ensure that Ofcom is focused on what we want it to be focused on and not on much lower priority issues?
My Lords, while my noble friend is talking about the possibility of excessive and disproportionate burden on businesses, can I just ask him about the possibility of excessive and disproportionate burden on the regulator? He seems to be saying that Ofcom is going to have to maintain, and keep up to date regularly, 25,000 risk assessments—this is on the Government’s own assessment, produced 15 months ago, of the state of the market then—even if those assessments carried out by Ofcom result in very little consequence for the regulated entity.
We know from regulation in this country that regulators already cannot cope with the burdens placed on them. They become inefficient, sclerotic and unresponsive; they have difficulty in recruiting staff of the same level and skills as the entities that they regulate. We have a Financial Services and Markets Bill going through at the moment, and the FCA is a very good example of that. Do we really think that this is a sensible burden to place on a regulator that is actually able to discharge it?
The Bill creates a substantial new role for Ofcom, but it has already substantially recruited and prepared for the effective carrying out of that new duty. I do not know whether my noble friend was in some of the briefings with officials from Ofcom, but it is very happy to set out directly the ways in which it is already discharging, or preparing to discharge, those duties. The Government have provided it with further resource to enable it to do so. It may be helpful for my noble friend to have some of those discussions directly with the regulator, but we are confident that it is ready to discharge its duties, as set out in the Bill.
I was about to say that we have already had a bit of discussion on Wikipedia. I am conscious that we are going to touch on it again in the debate on the next group of amendments so, at the risk of being marked down for repetition, which is a black mark on that platform, I shall not pre-empt what I will say shortly. But I emphasise that the Bill does not impose prescriptive, one-size-fits-all duties on services. The codes of practice from Ofcom will set out a range of measures that are appropriate for different types of services in scope. Companies can follow their own routes to compliance, so long as they are confident that they are effectively managing risks associated with legal content and, where relevant, harm to children. That will ensure that services that already use community moderation effectively can continue to do so—such as Wikipedia, which successfully uses that to moderate content. As I say, we will touch on that more in the debate on the next group.
Amendment 9, in the name of my noble friend Lord Moylan, is designed to exempt small and medium sized-enterprises working to benefit the public from the scope of the Bill. Again, I am sympathetic to the objective of ensuring that the Bill does not impose undue burdens on small businesses, and particularly that it should not inhibit services from providing valuable content of public benefit, but I do not think it would be feasible to exempt service providers deemed to be
“working to benefit the public”.
I appreciate that this is a probing amendment, but the wording that my noble friend has alighted on highlights the difficulties of finding something suitably precise and not contestable. It would be challenging to identify which services should qualify for such an exemption.
Taking small services out of scope would significantly undermine the framework established by the Bill, as we know that many smaller services host illegal content and pose a threat to children. Again, let me reassure noble Lords that the Bill has been designed to avoid disproportionate or unnecessary regulatory burdens on small and low-risk services. It will not impose a disproportionate burden on services or impede users’ access to valuable content on smaller services.
Amendment 9A in the name of the noble Lord, Lord Knight of Weymouth, is designed to exempt “sector specific search services” from the scope of the Bill, as the noble Baroness, Lady Merron, explained. Again, I am sympathetic to the intention here of ensuring that the Bill does not impose a disproportionate burden on services, but this is another amendment that is not needed as it would exempt search services that may pose a significant risk of harm to children, or because of illegal content on them. The amendment aims to exempt specialised search services—that is, those that allow users to
“search for … products or services … in a particular sector”.
It would exempt specialised search services that could cause harm to children or host illegal content—for example, pornographic search services or commercial search services that could facilitate online fraud. I know the noble Lord would not want to see that.
The regulatory duties apply only where there is a significant risk of harm and the scope has been designed to exclude low-risk search services. The duties therefore do not apply to search engines that search a single database or website, for example those of many retailers or other commercial websites. Even where a search service is in scope, the duties on services are proportionate to the risk of harm that they pose to users, as well as to a company’s size and capacity. Low-risk services, for example, will have minimal duties. Ofcom will ensure that these services can quickly and easily comply by publishing risk profiles for low-risk services, enabling them easily to understand their risk levels and, if necessary, take steps to mitigate them.
The noble Lord, Lord McCrea, asked some questions about the 200 most popular pornographic websites. If I may, I will respond to the questions he posed, along with others that I am sure will come in the debate on the fifth group, when we debate the amendments in the names of the noble Lord, Lord Morrow, and the noble Baroness, Lady Ritchie of Downpatrick, because that will take us on to the same territory.
I hope that provides some assurance to my noble friend Lord Moylan, the noble Baroness, Lady Fox, and others, and that they will be willing not to press their amendments in this group.
My Lords, I have to start with a slightly unprofessional confession. I accepted the Bill team’s suggestion on how my amendments might be grouped after I had grouped them rather differently. The result is that I am not entirely clear why some of these groupings are quite as they are. As my noble friend the Minister said, my original idea of having Amendments 9, 10 and 11 together would perhaps have been better, as it would have allowed him to give a single response on Wikipedia. Amendments 10 and 11 in this group relate to Wikipedia and services like it.
I am, I hope, going to give the Committee some relief, as I do not intend to repeat remarks made on the previous group. The extent to which my noble friend wishes to amplify his comments in response to that group is entirely a matter for him, since he said he was reserving material that he would like to bring forward but had not yet done so. If I do not speak further on Amendments 10 and 11, it is not because I am not interested in what my noble friend the Minister might have to say on the topic of Wikipedia.
To keep this fairly brief, I turn to Amendment 26, on age verification. I think we have all agreed in the Chamber that we are united in wanting to see children kept safe. Clause 11(3), on page 10 of the Bill, states that there will be a duty to
“prevent children of any age from encountering”
this content—“prevent” them “encountering” is extremely strong. We do not prevent children encountering the possibility of buying cigarettes or encountering the possibility of being injured crossing the road, but we are to prevent children from these encounters. It is strongly urged in the clause—it is given as an example—that age verification will be required for that purpose.
Of course, age verification works only if it applies to everybody: one does not ask just the children to prove their age; one has to ask everybody online. Unlike when I go to the bar in a pub, my grey hair cannot be seen online. So this provision will almost certainly have to extend to the entire population. In Clause 11(3)(b), we have an obligation to protect. Clearly, the Government intend a difference between “prevent” and “protect”, or they would not have used two different verbs, so can my noble friend the Minister explain what is meant by the distinction between “prevent” and “protect”?
My amendment would remove Clause 11(3) completely. It is, in essence, a probing amendment, and what I want to hear from the Government, apart from how they interpret the difference between “prevent” and “protect”, is how they expect this duty to be carried out without astonishingly annoying and deterring features being built into every user-to-user platform and website, so that every time we go on Wikipedia—in addition to dealing with the GDPR, accepting cookies and all the other nonsense we have to go through quite pointlessly—we then have to provide age verification of some sort.
What mechanism that might be, I do not know. I am sure that there are many mechanisms available for age verification. I do not wish to get into a technical discussion about what particular techniques might be used—I accept that there will be a range and that they will respond and adapt in the light of demand and technological advance—but I would like to know what my noble friend the Minister expects and how wide he thinks the obligation will be. Will it be on the entire population, as I suspect? Focusing on that amendment—and leaving the others to my noble friend the Minister to respond to as he sees fit—and raising those questions, I think that the Committee would like to know how the Government imagine that this provision will work. I beg to move.
My Lords, I will speak to the amendments in the name of the noble Lord, Lord Moylan, on moderation, which I think are more important than he has given them credit for—they apply more broadly than just to Wikipedia.
There is a lot of emphasis on platform moderation, but the reality is that most moderation of online content is done by users, either individually or in groups, acting as groups in the space where they operate. The typical example, which many Members of this House have experienced, is when you post something and somebody asks, “Did you mean to post that?”, and you say, “Oh gosh, no”, and then delete it. A Member in the other place has recently experienced a rather high-profile example of that through the medium of the newspaper. On a much smaller scale, it is absolutely typical that people take down content every day, either because they regret it or, quite often, because their friends, families or communities tell them that it was unwise. That is the most effective form of moderation, because it is the way that people learn to change their behaviour online, as opposed to the experience of a platform removing content, which is often experienced as the big bad hand of the platform. The person does not learn to change their behaviour, so, in some cases, it can reinforce bad behaviour.
Community moderation, not just on Wikipedia but across the internet, is an enormous public good, and the last thing that we want to do in this legislation is to discourage people from doing it. In online spaces, that is often a volunteer activity: people give up their time to try to keep a space safe and within the guidelines they have set for that space. The noble Lord, Lord Moylan, has touched on a really important area: in the Bill, we must be absolutely clear to those volunteers that we will not create all kinds of new legal operations and liabilities on them. These are responsible people, so, if they are advised that they will incur all kinds of legal risk when trying to comply with the Online Safety Bill, they will stop doing the moderation—and then we will all suffer.
On age-gating, we will move to a series of amendments where we will discuss age assurance, but I will say at the outset, as a teaser to those longer debates, that I have sympathy with the points made by the noble Lord, Lord Moylan. He mentioned pubs—we often talk about real-world analogies. In most of the public spaces we enter in the real world, nobody does any ID checking or age checking; we take it on trust, unless and until you carry out an action, such as buying alcohol, which requires an age check.
It is legitimate to raise this question, because where we fall in this debate will depend on how we see public spaces. I see a general-purpose social network as equivalent to walking into a pub or a town square, so I do not expect to have my age and ID checked at the point at which I enter that public space. I might accept that my ID is checked at a certain point where I carry out various actions. Others will disagree and will say that the space should be checked as soon as you go into it—that is the boundary of the debate we will have across a few groups. As a liberal, I am certainly on the side that says that it is incumbent on the person wanting to impose the extra checks to justify them. We should not just assume that extra checks are cost-free and beneficial; they have a cost for us all, and it should be imposed only where there is a reasonable justification.
My Lords, I am grateful to all noble Lords who have contributed to this slightly disjointed debate. I fully accept that there will be further opportunities to discuss age verification and related matters, so I shall say no more about that. I am grateful, in particular, to the noble Lord, Lord Allan of Hallam, for supplying the deficiency in my opening remarks about Amendments 10 and 11, and for explaining just how important they are. I also thank the noble Lord, Lord Stevenson. It was good of him to say, in the open approach he took on the question of age, that there are issues still to be addressed. I do not think anybody feels that we have yet got this right, and I think we are going to have to be very open in that discussion when we get to it. That is also true of what the noble Lord, Lord Allan of Hallam, said: we do not yet have clarity as to where the age boundary lies—I like his expression—for the public space. Where is the point at which, if checks are needed, those checks are to be applied? These are all matters to discuss, and I hope noble Lords will forgive me if I do not address each individual contribution separately.
I would like to say something, I hope not unfairly or out of scope, about what was said by the noble Baronesses, Lady Finlay of Llandaff and Lady Kidron, when they used, for the first time this afternoon, the phrase “zero tolerance”, and, at the same time, talked about a risk-based approach. I have, from my own local government experience, a lot of experience of risk-based approaches taken in relation to things—very different, of course, from the internet—such as food safety, where local authorities grade restaurants and food shops and take enforcement action and supervisory action according to their assessment of the risk that those premises present. That is partly to do with their assessment of the management and partly to do with their experience of things that have gone wrong in the past. If you have been found with mouse droppings and you have had to clean up the shop, then you will be examined a great deal more frequently until the enforcement officers are happy; whereas if you are always very well run, you will get an inspection visit maybe only once a year. That is what a risk-based assessment consists of. The important thing to say is that it does not produce zero tolerance or zero outcomes.
I just want to make the point that I was talking about zero tolerance at the end of a ladder of tolerance, just to be clear. Letting a seven-year-old child into an 18-plus dating app or pornographic website is where the zero tolerance is—everything else is a ladder up to that.
I beg the noble Baroness’s pardon; I took that for granted. There are certain things—access to pornography, material encouraging self-harm and things of that sort—where one has to have zero tolerance, but not everything. I am sorry; I fully accept that I should have made that more explicit in my remarks. Not everything is to be zero-toleranced, so to speak, but certain things are. However, that does not mean that they will not happen. One has to accept that there will be leakage around all this, just as some of the best-run restaurants, managed superbly for years, will turn out on occasion to be the source of food poisoning. One has to accept that this is never going to be as tight as some of the advocates want but, with that, I hope I will be given leave to withdraw—
May I intervene, since I too have been named in the noble Lord’s response? My concern is about the most extreme, most violent, most harmful and most destructive material. There are some terrible things posted online. You would not run an open meeting on how to mutilate a child or how to stab somebody most effectively to do the most harm. It is at this extreme end that I cannot see anyone in the offline world promoting classes in any of these terrible activities. There is therefore a sense that exposure to these things is of no benefit but promotes intense harm. People who are particularly vulnerable at a formative age in their development should not be exposed to them, because they would not be exposed to them elsewhere. I am speaking personally, not for anybody else, but I stress that this is the level at which the tolerance should be set to zero, because we set it to zero in the rest of our lives.
Everything the noble Baroness has said is absolutely right, and I completely agree with her. The point I simply want to make is that no form of risk-based assessment will achieve a zero-tolerance outcome, but—
I am so sorry, but may I offer just one final thought from the health sector? While the noble Lord is right that where there are human beings there will be error, there is a concept in health of the “never event”—that when that error occurs, we should not tolerate it, and we should expect the people involved in creating that error to do a deep inspection and review to understand how it occurred, because it is considered intolerable. I think the same exists in the digital world in a risk assessment framework, and it would be a mistake to ignore it.
My Lords, I am now going to attempt for the third time to beg the House’s leave to withdraw my amendment. I hope for the sake of us all, our dinner and the dinner break business, for which I see people assembling, that I will be granted that leave.
(1 year, 9 months ago)
Lords Chamber
My Lords, it is hard to think of something new to say at the end of such a long debate, but I am going to try. I am helped by the fact that I find myself, very unusually, somewhat out of harmony with the temper of the debate in your Lordships’ House over the course of this afternoon and evening. I rather felt at some points that I had wandered into a conference of medieval clerics trying to work out what measures to take to mitigate the harmful effects of the invention of moveable type.
In fact, it probably does require an almost religious level of faith to believe that the measures we are discussing are actually going to work, given what my noble friends Lord Camrose and Lord Sarfraz have said about the agility of the cyber world and the avidity of its users for content. Now, we all want to protect children, and if what had come forward had been a Bill making it a criminal offence to display, or allow to be displayed, to children specified harmful content—with condign punishment—we would all, I am sure, have rallied around it and rejoiced. That is how we would have dealt with this 50 years ago. But instead we have this Bill, and it is not a short Bill doing that.
Let me make three brief points about the Bill in the time we have available. The first is a general one about public administration. We seem to be wedded to the notion that the way in which we should run large parts of the life of the country is through regulators rather than law, and that the independence of those regulators must be sacrosanct. In a different part of your Lordships’ House, there has been discussion in the last few days of the Financial Services and Markets Bill in Committee. There, of course, we have been discussing the systemic failures of regulators—the box-ticking, the legalism, the regulatory capture and the way in which regulators’ own interests emerge and come to motivate their behaviour. None the less, we carry on giving regulators more and more powers. Ofcom is going to be one of the largest regulators, and one of the most important in our lives, and it is going to be wholly unaccountable. We are not going to be happy about that.
The second point I want to make is that the Bill represents a serious threat to freedom of speech. This is not contentious; the Front Bench admits it. The Minister says that it is going to strike the right balance. I have seen very little evidence, in the Bill or indeed in the course of the day’s debate, that that balance is going to be struck at all, let alone in what I might consider the right place—and what I consider the right place might not be what others consider it to be. These are highly contentious issues, and, in the end, we will in effect be hiving them off to an unaccountable regulator.
The third point that I want to make, because I think that I am possibly going to come in under my four minutes, is that I did vote Conservative at the last general election; I always have. But that does not mean that I subscribe to every jot and tittle of the manifesto; in particular, I do not think that I ever signed up to live in a country that was the safest place in the world to be on the internet. If I had, I would have moved to China already, where nothing is ever out of place on the internet. That is all I have to say, and I shall be supporting amendments that move in the general direction that I have indicated.
(2 years, 9 months ago)
Lords Chamber
My noble friend asks a good question, on which I will have to write to him with the answer and the full list, if he will forgive me for doing so.
I was just coming to the third reason why Amendment 3 is our preferred way of proceeding. The provisions inserted in this House would not achieve their objective of speeding up the pace of delivery. We must reiterate that the release of this money will not be immediate; indeed, we anticipate that it will take several years for the £880 million to be released, and we do not expect any funds to be available for some time. Undercutting the consultation process would not materially affect the pace of that release. The Government have committed to launching the first public consultation on the purposes of the expanded English portion as soon as possible after Royal Assent. We anticipate that it could be live as soon as this summer and that it will be open for at least 12 weeks.
I repeat my commitment to write to my noble friend with the answer to his question, and I beg to move.
My Lords, before my noble friend sits down, does he agree that, especially in current circumstances, it would be wholly inappropriate to transfer funds from the TfL balance sheet by way of seizing what are alleged to be surplus Oyster assets, many of which are there because people, often from abroad, choose to leave balances on their Oyster cards for when they visit London, which may be only once every few years?
My noble friend raises an interesting point that has not been made hitherto during the passage of the Bill, but I know that he speaks with considerable experience from his time working with TfL. If he will allow me, I will write to him with further information about the implications for Oyster cards, a matter that has not been covered here. It may have been covered in another place, but I have not seen whether that is the case.
(3 years, 4 months ago)
Lords Chamber
I think the policy of consulting and getting an understanding of what would create a strong strategic future for a key public service broadcaster is entirely valid. The noble Lord is right that Channel 4 has been hugely successful in supporting our independent production sector. The Government are committed to seeing that continue, and we will take into account any impacts on that sector as we move forward.
My Lords, the Reuters Institute for the Study of Journalism at the University of Oxford has found that the UK has the lowest average access to local news of any of the countries measured in its recent survey. If there were any reason for not privatising Channel 4, it would be to use the platform as the basis of a new local and regional television service. Has my noble friend considered that?
I agree with my noble friend that locally relevant television and, in particular, local news is a very important part of the UK’s public service broadcasting, which has been highlighted by the pandemic. These are issues which will form part of our strategic review of PSBs.
(3 years, 5 months ago)
Lords Chamber
The Government absolutely recognise the role that local authorities play, and, as the noble Earl is aware, they are important funders of DMOs. The review will look at the right funding structure for these organisations going forward.
My Lords, does my noble friend accept that regional transport authorities have an important role to play in welcoming and facilitating both national and international tourism? I am thinking, for example, of the sorts of visitor welcome centres that Transport for London has habitually maintained at major London rail termini. Will she take steps to ensure that funding is directed at keeping these in operation?
The Government recognise the role that regional transport authorities can play in providing information and assistance to visitors, as my noble friend has outlined, particularly when they co-ordinate that work with the DMOs. I have already mentioned the £25 billion provided to support the sector, which has been one of the worst hit; we have supported over 87% of businesses in this area.