My Lords, before we continue this debate, I want to understand why we have changed the system so that we break partway through a group of amendments. I am sorry, but I think this is very poor. It is definitely a retrograde step. Why are we doing it? I have never experienced this before. I have sat here and waited for the amendment I have just spoken to. We have now had a break; it has broken the momentum of that group. It was even worse last week, because we broke for several days halfway through the debate on an amendment. This is unheard of in my memory of 25 years in this House. Can my noble friend the Minister explain who made this decision, and how this has changed?
I have not had as long in your Lordships’ House, but this is not unprecedented, in my experience. These decisions are taken by the usual channels; I will certainly feed that back through my noble friend. One of the difficulties, of course, is that because there are no speaking limits on legislation and we do not know how many people want to speak on each amendment, the length of each group can be variable, so I think this is for the easier arrangement of dinner-break business. Also, for the dietary planning of those of us who speak on every group, it is useful to have some certainty, but I do appreciate my noble friend’s point.
Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that, before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.
My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.
Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.
In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.
Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.
While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.
To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.
Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.
That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.
Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do; if that is a statement of intent, I have let it go in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective, and I am pleased to have added my name to it.
Amendment 41 seeks to require that users can see whether another user is verified. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether or not they are verified may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without this change.
Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.
Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.
Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.
The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.
A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.
I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?
Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.
Government Amendment 43A stands in my name. It clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where those non-verified users are based.
Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.
The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.
Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.
My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?
I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.
We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.
Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.
The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.
I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.
Lawyers—don’t you love them? How on earth are we supposed to unscramble that at this time of night? It was good to have my kinsman, the noble and learned Lord, Lord Hope, back in our debates. We were remarking only a few days ago that we had not seen enough lawyers in the House in these debates. One appears, and light appears. It is a marvellous experience.
I thank the Committee for listening to my earlier introductory remarks; I hope they helped to untangle some of the issues. The noble Lord, Lord Black, made it clear that the press are happy with what is in the current draft. There could be some changes, and we have heard a number of examples of ways in which one might either top or tail what there is.
There was one question that perhaps he could have come back on, and maybe he will, as I have raised it separately with the department before. I agree with a lot of what he said, but it applies to a lot more than just news publishers. Quality journalism more generally enhances and restores our faith in public services in so many ways. Why is it only the news? Is there a way in which we could broaden that? If there is not this time round, perhaps that is something we need to pick up later.
As the noble Lord, Lord Clement-Jones, has said, the noble Viscount, Lord Colville, made a very strong and clear case for trying to think again about what journalism does in the public realm and making sure that the Bill at least carries that forward, even if it does not deal with some of the issues that he raised.
We have had a number of other good contributions about how to capture some of the good ideas that were flying around in this debate and keep them in the foreground so that the Bill is enhanced. But I think it is time that the Minister gave us his answers.
I join noble Lords who have sent good wishes for a speedy recovery to the noble Baroness, Lady Featherstone.
Amendments 46, 47 and 64, in the name of my noble friend Lady Stowell of Beeston, seek to require platforms to assess the risk of, and set terms for, content currently set out in Clause 12. Additionally, the amendments seek to place duties on services to assess risks to freedom of expression resulting from user empowerment tools. Category 1 platforms are already required to assess the impact on free expression of their safety policies, including user empowerment tools; to keep that assessment up to date; to publish it; and to demonstrate the positive steps they have taken in response to the impact assessment in a publicly available statement.
Amendments 48 and 100, in the name of the noble Lord, Lord Stevenson, seek to introduce a stand-alone duty on category 1 services to protect freedom of expression, with an accompanying code of practice. Amendments 49, 50, 53A, 61 and 156, in the name of the noble Baroness, Lady Fox, seek to amend the Bill’s Clause 17 and Clause 18 duties and clarify duties on content of democratic importance.
All in-scope services must already consider and implement safeguards for freedom of expression when fulfilling their duties. Category 1 services will need to be clear about what content is acceptable on their services and how they will treat it, including when removing or restricting access to it, and they will need to enforce those rules consistently. In setting these terms of service, they must adopt clear policies designed to protect journalistic and democratic content. That will ensure that the most important types of content benefit from additional protections while guarding against the arbitrary removal of any content. Users will be able to access effective appeal mechanisms if content is unfairly removed. That marks a considerable improvement on the status quo.
Requiring all user-to-user services to justify why they are removing or restricting each individual piece of content, as Amendment 53A would do, would be disproportionately burdensome on companies, particularly small and medium-sized ones. It would also duplicate some of the provisions I have previously outlined. Separately, as private entities, service providers have their own freedom of expression rights. This means that platforms are free to decide what content should or should not be on their website, within the bounds of the law. The Bill should not mandate providers to carry or to remove certain types of speech or content. Accordingly, we do not think it would be appropriate to require providers to ensure that free speech is not infringed, as suggested in Amendment 48.
Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?
I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.
We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.
In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.
Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.
Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.
Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.
It is so complicated that the Minister is almost enticing me to stand up and ask about it. Let us just get that right: the reference to the Article 8 powers exists and applies to those bodies in the UK to which such equivalent legislation applies, so that ties us into Ofcom. Companies cannot be affected by it because it is a public duty, not a private duty, but am I then allowed to walk all the way around the circle? At the end, can Ofcom look back at the companies to establish whether, in Ofcom’s eyes, its requirements in relation to its obligations under Article 8 have or have not taken place? It is a sort of transparent, backward-reflecting view rather than a proactive proposition. That seems a complicated way of saying, “Why don’t you behave in accordance with Article 8?”
Yes. Ofcom, which is bound by it through the Human Rights Act 1998, can ask those questions and make that assessment of the companies, but it would not be right for private companies to be bound directly by a convention to which they cannot be signatories. Ofcom will be looking at these questions, but the duty rests on it, as a body bound by the Human Rights Act.
It is late at night and this is slightly tedious, but in the worst of all possible circumstances, Ofcom would be looking at what happened over the last year in relation to its codes of practice and assertions about a particular company. Ofcom is then in trouble because it has not discharged its Article 8 obligations, so who gets to exercise a whip on whom? Sorry, whips are probably the wrong things to use, but you see where I am coming from. All that is left is for the Secretary of State, but probably it would effectively be Parliament, to say to Ofcom, “You’ve failed”. That does not seem a very satisfactory solution.
Platforms will be guided by Ofcom in taking measures to comply with their duties which are recommended in Ofcom’s codes, and which contain safeguards for privacy, including ones based on the European Convention on Human Rights and the rights therein. Paragraph 10(2)(b) of Schedule 4 requires Ofcom to ensure that measures, which it describes in the code of practice, are designed in light of the importance of protecting the privacy of users. Clause 42(2) and (3) provides that platforms will be treated as complying with the privacy duties set out at Clause 18(2) and Clause 28(2), if they take the recommended measures that Ofcom sets out in the codes.
It worked. In seriousness, we will both consult the record and, if the noble Lord wants more, I am very happy to set it out in writing.
Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead, seeks to clarify that “freedom of expression” in Clause 18 refers to the
“freedom to impart ideas, opinions or information”,
as referred to in Article 10 of the European Convention on Human Rights. I think I too have been guilty of using the phrases “freedom of speech” and “freedom of expression” as though they were interchangeable. Freedom of expression, within the law, is intended to encompass all the freedom of expression rights arising from UK law, including under common law. The rights to freedom of expression under Article 10 of the European Convention on Human Rights include not only the right to impart ideas, opinions and information but also the right to receive them. Any revised definition of freedom of expression to be included in the Bill should refer to both aspects of the Article 10 definition, given the importance for both children and adults of receiving information via the internet. We recognise the importance of clarity in relation to the duties set out in Clauses 18 and 28, and we are very grateful to the noble and learned Lord for proposing this amendment, and for the experience he brings to bear on behalf of the Constitution Committee of your Lordships’ House. The Higher Education (Freedom of Speech) Bill and the Online Safety Bill serve very different purposes, but I am happy to say that the Bill team and I will consider this amendment closely between now and Report.
Amendments 101, 102, 109, 112, 116, 121, 191 and 220, in the name of my noble friend Lord Moylan, seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties, and when drafting or amending codes of practice or guidance. Ofcom must already ensure that it protects freedom of expression when overseeing the Bill, because it is bound by the Human Rights Act, as I say. It also has specific duties to ensure that it is clear about how it is protecting freedom of expression when exercising its duties, including when developing codes of practice.
My noble friend’s Amendment 294 seeks to remove “psychological” from the definition of harm in the Bill. It is worth being clear that the definition of harm is used in the Bill as part of the illegal and child safety duties. There is no definition of harm, psychological or otherwise, with regard to adults, given that the definition of content which is harmful to adults was removed from the Bill in another place. With regard to children, I agree with the points made by the noble Baroness, Lady Kidron. It is important that psychological harm is captured in the Bill’s child safety duties, given the significant impact that such content can have on young minds.
I invite my noble friend and others not to press their amendments in this group.