Just to clarify, in a way we have reduced this debate to whether the default position should be on or off, although in fact that is only one aspect of it. My concern, and what I maybe spent too long talking about, is what happens if we turn the toggles to “on”. The assumption we keep making is that once they are on, we are safe. The difficulty is that the categories of what is filtered out after turning them on are not necessarily what the user thinks they are. I am simply asking how you get around that; otherwise, we think it is too easy: turn it on or off; press the button. Is it not problematic for us all if, in thinking you are going to stop seeing hate, what gets filtered as hate turns out actually to be legitimate and interesting political ideas?
As ever, the noble Baroness is an important voice in bursting our bubble in the Chamber. I continue to respect her for that. It will not be perfect; there is no perfect answer to all this. I am siding with safety and caution rather than a bit of a free-for-all. Sometimes there might be overcaution and aspects of debate where the platforms, the regulator, the media, and discussion and debate in this Chamber would say, “The toggles have got it wrong”, but we just have to make a judgment about which side we are on. That is what I am looking forward to hearing from the Minister.
These amendments are supported on all sides and by a long list of organisations, as listed by the noble Baroness, Lady Morgan, and the noble Lord, Lord Clement-Jones. The Minister has not conceded very much at all so far to this Committee. We have heard compelling speeches, such as those from the noble Baroness, Lady Parminter, that have reinforced my sense that he needs to give in on this when we come to Report.
I will also speak to my Amendment 38A. I pay tribute to John Penrose MP, who was mentioned by the noble Baroness, Lady Harding, and his work in raising concerns about misinformation and in stimulating discussion outside the Chambers among parliamentarians and others. Following discussions with him and others in the other place, I propose that users of social media should have the option to filter out content the provenance of which cannot be authenticated.
As we know, social media platforms are often awash with content that is unverified, misleading or downright false. This can be particularly problematic when it comes to sensitive or controversial topics such as elections, health or public safety. In these instances, it can be difficult for users to know whether the information presented to them is accurate. Many noble Lords will be familiar with the deep-fake photograph of the Pope in a white puffa jacket that recently went viral, or the use of imagery for propaganda purposes following the Russian invasion of Ukraine.
The Content Authenticity Initiative has created an open industry standard for content authenticity and provenance. Right now, tools such as Adobe Photoshop allow users to turn on content credentials to securely attach provenance data to images and to any edits then made to those images. That technology has now been adopted by camera manufacturers such as Leica and Nikon, so the means to do some of this, and to give us some reassurance, already exists.
Amendment 38A would allow users to filter out unverified content and is designed to flag posts or articles that do not come from a reliable source or have not been independently verified by a reputable third party. Users could then choose to ignore or filter out such content, ensuring that they are exposed only to information that has been vetted and verified. This would not only help users to make more informed decisions but also help to combat the spread of false information on social media platforms. By giving users the power to filter out unverified content, we can help to ensure that social media platforms are not used to spread harmful disinformation or misinformation.
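To make the mechanism concrete, a minimal sketch of how such a user-controlled filter might work is set out below. It assumes a hypothetical feed model in which each post may carry a C2PA-style provenance manifest and the user has a single “hide unverified content” toggle; the names and the placeholder verification check are illustrative only and are not drawn from the amendment or from any platform’s actual design.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Post:
    author: str
    text: str
    # Provenance manifest attached to media in the post (for example a
    # C2PA "content credential"); None if no provenance data is attached.
    provenance_manifest: Optional[bytes] = None


def provenance_is_verified(manifest: Optional[bytes]) -> bool:
    """Placeholder check. A real system would validate the manifest's
    cryptographic signatures against trusted signers, e.g. with a C2PA
    verification library; here any non-empty manifest counts as verified."""
    return manifest is not None and len(manifest) > 0


def filter_feed(posts: list[Post], hide_unverified: bool) -> list[Post]:
    """Apply the user's 'hide unverified content' toggle to a feed."""
    if not hide_unverified:
        return posts
    return [p for p in posts if provenance_is_verified(p.provenance_manifest)]


if __name__ == "__main__":
    feed = [
        Post("news_org", "Report with signed imagery", b"signed-manifest"),
        Post("anon123", "Viral image of unknown origin"),  # no provenance data
    ]
    for post in filter_feed(feed, hide_unverified=True):
        print(post.author, "-", post.text)
```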
Amendments 42 and 45, in the name of my noble friend Lord Stevenson, are good and straightforward improvements to the Bill, suggesting that Clause 12(11) should include all the protected characteristics in the Equality Act 2010. I listened closely to the argument of the noble Baroness, Lady Buscombe, about listing the various characteristics, but I think it makes much more sense to do what my noble friend is suggesting and just reference the Equality Act. If we chose separately to amend the Equality Act to change protected characteristics, that change would then flow through into this Bill. That seems to be a much more sensible and pragmatic way to proceed. Similarly, in Amendment 45 my noble friend is suggesting that consideration should be given by the Secretary of State as to whether any of these tools should also be available to child users, and we have heard good arguments in favour of that when it comes to dealing with the cliff-edge effect, as described by the noble Baroness, Lady Kidron.
Lastly, and the Committee will perhaps forgive me for a slightly bizarre point, Amendment 283ZA, in the name of the noble Baroness, Lady Fox, to Clause 170(7) in Part 11 invites us to consider, among other things, whether a bot can act reasonably. Given that self-coding bots are now technically possible, I would be fascinated to hear the Minister’s view on whether an autonomous bot can have reason. I asked ChatGPT this question. For the benefit of the Committee, it replied:
“As an artificial intelligence language model, I do not have consciousness or emotions, and therefore, I do not have a capacity for ‘reason’ in the way that humans understand it. However, I am programmed to use advanced algorithms that allow me to process and analyze large amounts of data, recognize patterns, and provide reasoned responses based on the information available to me. This allows me to simulate a form of reasoning, but it is important to note that it is not the same as human reasoning, as I do not have subjective experiences or personal biases. Ultimately, my abilities are limited to the algorithms and data that have been programmed into my system, and I cannot generate my own subjective experiences or judgments.”
That is the view of the algorithm as to whether or not bots can have reason. I look forward to the Minister’s response.
My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.
I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice
“to a regulated service which offers private messaging with end-to-end encryption”;
and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the simplest and most straightforward manner.
Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives from WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that because they are doing an awful lot of good stuff.
Basically, this is such a sensitive matter, as has been said, that it is important for the Government to make their policy intentions clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics cut across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly to them.
My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.
If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.
I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether, and if so how, we are bringing the use and misuse of VPNs into scope for regulation by Ofcom.
My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.
Just to note, a lot of the charitable organisations and so on are making money. I am not suggesting that, because they are making money, they are evil, but I do not think that the argument quite works in this instance, because the phrase “commercial sensitivity” is used by organisations which are not big businesses going in; they are small and socially worthy, but they are also commercial. Let me tell you, a lot of them are making quite a lot of money, even if they are doing it with the best intentions. That is not really the point.
While we are at it, I declare my interest: I work with a company called EVERFI, which does some of this work, but it liaises with money-making commercial organisations to provide resources at no charge for teachers. Some of those, for example, relate to careers, which is part of this group of amendments. There are excellent science employers and computer gaming companies, for example, which are trying to help create the learning that will mean that people from all sorts of backgrounds are more inclined, readier and more confident to think that they could work in those industries. I would not want anything that the noble Baroness is saying to curtail that sort of important learning resource.