Will my noble friend draw attention to the part of Clause 122 that says that Ofcom cannot issue a requirement which is not technically feasible, as he has just said? That does not appear in the text of the clause, and it creates a potential conflict. Even if the requirement is not technically feasible—or, at least, if the platform claims that it is not—Ofcom’s power to require it is not mitigated by the clause. It still has the power, which it can exercise, and it can presumably take some form of enforcement action if it decides that the company is not being wholly open or honest. The technical feasibility is not built into the clause, but my noble friend has just added it, as with quite a lot of other stuff in the Bill.
It has to meet minimum standards of accuracy and must have privacy safeguards in place. The clause talks about those in a positive sense, which sets out the expectation. I am happy to make clear, as I have, what that means: if the appropriate technology does not exist that meets these requirements, then Ofcom will not be able to use Clause 122 to require its use. I hope that that satisfies my noble friend.
My Lords, I want to thank the Minister and other noble colleagues for such kind words. I really appreciate it.
I want to say very little. It has been an absolute privilege to work with people across both Houses on this. It is not every day that one keeps the faith in the system, but this has been a great pleasure. In these few moments that I am standing, I want to pay tribute to the bereaved parents, the children’s coalition, the NSPCC, my colleagues at 5Rights, Barnardo’s, and the other people out there who listen and care passionately that we get this right. I am not going to go through what we got right and wrong, but I think we got more right than we got wrong, and I invite the Minister to sit with me on Monday in the Gallery to make sure that those last little bits go right—because I will be there. I also remind the House that we have some work in the data Bill vis-à-vis the bereaved parents.
In all the thanks—and I really feel that I have had such tremendous support on my area of this Bill—I pay tribute to the noble Baroness, Lady Benjamin. She was there before many people were and suffered cruelly in the legislative system. Our big job now is to support Ofcom, hold it to account and help it in its task, because that is Herculean. I really thank everyone who has supported me through this.
My Lords, I am sure that your Lordships would not want the Bill to pass without hearing some squeak of protest and dissent from those of us who have spent so many days and weeks arguing for the interests of privacy and free speech, to which the Bill remains a very serious and major threat.
Before I come to those remarks, I associate myself with what other noble Lords have said about what a privilege it has been, for me personally and for many of us, to participate over so many days and weeks in what has been the House of Lords at its deliberative best. I almost wrote down that we have conducted ourselves like an academic seminar, but when you think about what most academic seminars are like—with endless PowerPoint slides and people shuttling around, and no spontaneity whatever—we exceeded that by far. The conversational tone that we had in the discussions, and the way in which people who did not agree were able to engage—indeed, friendships were made—meant that the whole thing was done with a great deal of respect, even for those of us who were in the small minority. At this point, I should perhaps say on behalf of the noble Baroness, Lady Fox of Buckley, who participated fully in all stages of the Bill, that she deeply regrets that she cannot be in her place today.
I am not going to single out anybody except for one person. I made the rather frivolous proposal in Committee that all our debates should begin with the noble Lord, Lord Allan of Hallam; we learned so much from every contribution he made that he really should have kicked them all off. We would all have been a great deal more intelligent about what we were saying, and understood it better, had we heard what he had to say. I certainly have learned a great deal from him, and that was very good.
I will raise two issues only that remain outstanding and are not assuaged by the very odd remarks made by my noble friend as he moved the Third Reading. The first concerns encryption. The fact of the matter is that everybody knows that you cannot do what Ofcom is empowered by the Bill to do without breaching end-to-end encryption. It is as simple as that. My noble friend may say that that is not the Government’s intention and that it cannot be forced to do it if the technology is not there. None of that is in the Bill, by the way. He may say that at the Dispatch Box but it does not address the fact that end-to-end encryption will be breached if Ofcom finds a way of doing what the Bill empowers it to do, so why have we empowered it to do that? How do we envisage that Ofcom will reconcile those circumstances where platforms say that they have given their best endeavours to doing something and Ofcom simply does not believe that they have? Of course, it might end up in the courts, but the crucial point is that that decision, which affects so many people—and so many people nowadays regard it as a right to have privacy in their communications—might be made by Ofcom or by the courts but will not be made in this Parliament. We have given it away to an unaccountable process and democracy has been taken out of it. In my view, that is a great shame.
I come back to my second issue—I will not be very long. I constantly ask about Wikipedia. Is Wikipedia in scope of the Bill? If it is, is it going to have to do prior checking of what is posted? That would destroy its business model and make many minority language sites—I instanced Welsh—totally unviable. My noble friend said at the Dispatch Box that, in his opinion, Wikipedia was not going to be in scope of the Bill. But when I asked why we could not put that in the Bill, he said it was not for him to decide whether it was in scope and that the Government had set up this wonderful structure whereby Ofcom will tell us whether it is—almost without appeal, and again without any real democratic scrutiny. Oh yes, and we might have a Select Committee, which might write a very good, highly regarded report, which might be debated some time within the ensuing 12 months on the Floor of your Lordships’ House. However, we will have no say in that matter; we have given it away.
I said at an earlier stage of the Bill that, for privacy and censorship, this represents the closest thing to a move back to the Lord Chamberlain and Lady Chatterley’s Lover that you could imagine but applied to the internet. That is bad, but what is almost worse is this bizarre governance structure where decisions of crucial political sensitivity are being outsourced to an unaccountable regulator. I am very sad to say that I think that, at first contact with reality, a large part of this is going to collapse, and with it a lot of good will be lost.
My Lords, I am conscious of the imprecation earlier from the noble Lord, Lord Stevenson of Balmacara, that we keep our contributions short, but I intend to take no notice of it. That is for the very good reason that I do not think the public would understand why we disposed of such a momentous matter as bringing to an end end-to-end encryption on private messaging services as a mere technicality and a brief debate at the end of Report.
It is my view that end-to-end encryption is assumed nowadays by the vast majority of people using private messaging services such as WhatsApp, iMessage and Signal. They are unaware, I think, of the fact that it is about to be taken from them by Clause 111 of the Bill. My amendment would prevent that. It is fairly plain; it says that
“A notice under subsection (1)”
of Clause 111
“may not impose a requirement relating to a service if the effect of that requirement would be to require the provider of the service to weaken or remove end-to-end encryption applied in relation to the service”.
My noble friend says that there is no threat of ending end-to-end encryption in his proposal, but he achieves that by conflating two things—which I admit my own amendment conflates, but I will come back to that towards the end. They are the encryption of platforms and the encryption of private messaging services. I am much less concerned about the former. I am concerned about private messaging services. If my noble friend was serious in meaning that there was no threat to end-to-end encryption, then I cannot see why he would not embrace my amendment, but the fact that he does not is eloquent proof that it is in fact under threat, as is the fact that the NSPCC and the Internet Watch Foundation are so heavily lobbying against my amendment. They would not be doing that if they did not think it had a serious effect.
I shall not repeat at any length the technical arguments we had in Committee, but the simple fact is that if you open a hole into end-to-end encryption, as would be required by this provision, then other people can get through that hole, and the security of the system is compromised. Those other people may not be very nice; they could be hostile state actors—we know hostile state actors who are well enough resourced to do this—but they could also be our own security services and others, from whom we expect protection. Normally, we do get a degree of protection from those services, because they are required to have some form of warrant or prior approval but, as I have explained previously in debate on this, these powers being given to Ofcom require no warrant or prior approval in order to be exercised. So there is a vulnerability, but there is also a major assault on privacy. That is the point on which I intend to start my conclusion.
If we reflect for a moment, the evolution of this Bill in your Lordships’ House has been characterised and shaped, to a large extent, by the offer made by the noble Lord, Lord Stevenson of Balmacara, when he spoke at Second Reading, to take a collaborative approach. But that collaborative approach has barely extended to those noble Lords concerned about privacy and freedom of expression. As a result, in my view, those noble Lords rightly promoting child protection have been reckless to the point of overreaching themselves.
If we stood back and had to explain to outsiders that we were taking steps today that took end-to-end encryption and the privacy they expect on their private messaging services away from them, together with the security and protection it gives, of course, in relation to scams and frauds and all the other things where it has a public benefit, then I think they would be truly outraged. I do not entirely understand how the Government think they could withstand that outrage, were it expressed publicly. I actually believe that the battle for this Bill—this part of this Bill, certainly—is only just starting. We may be coming to the end here, but I do not think that this Bill is settled, because this issue is such a sensitive one.
Given the manifest and widespread lack of support for my views on this question in your Lordships’ House in Committee, I will not be testing the opinion of the House today. I think I know what the opinion of the House is, but it is wrong, and it will have to be revised. My noble friend simply cannot stand there and claim that what he is proposing is proportionate and necessary, because it blatantly and manifestly is not.
My Lords, the powers in Clause 111 are perhaps the most controversial outstanding issue in the Bill. I certainly agree with the noble Lord, Lord Moylan, that they deserve some continued scrutiny. I suspect that Members of another place are being lobbied on this extensively right now. Again, it is one of the few issues; they may not have heard of the Online Safety Bill, but they will do in the context of this particular measure.
We debated the rights and wrongs of encryption at some length in Committee, and I will not repeat those points today, not least because the noble Lord, Lord Moylan, has made some of the arguments as to why encryption is important. I will instead today focus on the future process, assuming that the Clause 111 powers will be available to Ofcom as drafted and that we are not going to accept the amendment from the noble Lord, Lord Moylan.
Amendments 258 and 258ZA, in my name and that of my noble friend Lord Clement-Jones, both aim to improve the process of issuing a Clause 111 order by adding in some necessary checks and balances.
As we debate this group, we should remember that the Clause 111 powers are not specific to encrypted services—I think the Minister made this point—and we should have the broader context in mind. I often try to bring some concrete scenarios to our discussions, and it may be helpful to consider three different scenarios in which Ofcom might reach for a Clause 111 notice.
The first is where a provider has no particular objections to using technology to identify and remove child sexual exploitation and abuse material or terrorist material but is just being slow to do this. There are mature systems out there. PhotoDNA is very well known in the industry and effectively has a database with digital signatures of known child sexual exploitation material. All the services we use on a daily basis such as Facebook, Instagram and others will check uploaded photos against that database and, where it is child sexual exploitation material, they will make sure that it does not get shown and that those people are reported to the authorities.
I can imagine scenarios where Ofcom is dealing with a service which has not yet implemented the technology—but does not have a problem doing it—and the material is unencrypted so there is no technical barrier; it is just being a bit slow. In those scenarios, Ofcom will tell the service to get on with it or it will get a Clause 111 notice. In those circumstances, in most cases the service will just get on with it, so Ofcom will be using the threat of the notice as a way to encourage the slow coaches. That is pretty unexceptional; it will work in a pretty straightforward way. I think the most common use of these notices may be to bring outliers into the pack of those who are following best practice. Ofcom may not even need to issue any kind of warning notice at all and will not get past the warning notice period. Waving a warning notice in front of a provider may be sufficient to get it to move.
The second scenario is one where the provider equally does not object to the use of the technology but would prefer to have a notice before it implements it. Outside the world of tech companies, it may seem a little strange why a provider would want to be ordered to do something rather than doing the right thing voluntarily, but we have to remember that the use of this kind of technology is legally fraught in many jurisdictions. There have been court cases in a number of places, not least the European Union, where there are people who will challenge whether you should use this technology on unencrypted services, never mind encrypted ones. In those cases, you can imagine there will be providers, particularly those established outside the United Kingdom, which may say, “Look, we are fine implementing this technology, but Ofcom please can you give us a notice? Then when someone challenges it in court, we can say that the UK regulator made us do it”. That would be helpful to them. This second group will want a notice and here we will get to the point of the notice being issued. They are not going to contest it; they want to have the notice because it gives them some kind of legal protection.
I think those two groups are relatively straightforward: we are dealing with companies which are being slow or are looking for legal cover but do not fundamentally object. The third scenario, though, is the most challenging and it is where I think the Government could get into real trouble. My amendments seek to help the Government in situations where a provider fundamentally objects to being ordered to deploy a particular technology because it believes that that technology will create real privacy threats and risks to the service that it offers. I do not think the provider is being awkward in these circumstances; it has genuine concerns about the implications of the technology being developed or which it is being instructed to deploy.
In these circumstances, Ofcom may have all the reasons in the world to argue why it thinks that what it is asking for is reasonable. However, the affected provider may not accept those reasons and take quite a strong counterview and have all sorts of other arguments as to why what it is being asked to do is unacceptable and too high-risk. This debate has been swirling around at the moment as we think about current models of end-to-end encryption and client-side scanning technology, but we need to recognise that this Bill is going to be around for a while and there may be all sorts of other technologies being ordered to be deployed that we do not even know about and have not even been developed yet. At any point, we may hit this impasse where Ofcom is saying it thinks it is perfectly reasonable to order a company to do it and the service provider is saying, “No, as we look at this, our experts and our lawyers are telling us that this is fundamentally problematic from a privacy point of view”.
Just to be clear, am I right to understand my noble friend as saying that there is currently no technology that would be technically acceptable for tech companies to do what is being asked of them? Did he say that tech companies should be looking to develop the technology to do what may be required of them but that it is not currently available to them?
For clarification, if the answer to that is that the technology does not exist—which I believe is correct, although there are various snake oil salespeople out there claiming that it does, as the noble Baroness, Lady Fox of Buckley, said—my noble friend seems to be saying that the providers and services should develop it. This seems rather circular, as the Bill says that they must adopt an approved technology, which suggests a technology that has been imposed on them. What if they cannot and still get such a notice? Is it possible that these powers will never be capable of being used, especially if they do not co-operate?
To answer my noble friend Lady Stowell first, it depends on the type of service. It is difficult to give a short answer that covers the range of services that we want to ensure are covered here, but we are seeking to keep this and all other parts of the Bill technology neutral so that, as services develop, technology changes and criminals, unfortunately, seek to exploit that, technology companies can continue to innovate to keep children safe while protecting the privacy of their users. That is a long-winded answer to my noble friend’s short question, but necessarily so. Ofcom will need to make its assessments on a case-by-case basis and can require a company to use its best endeavours to innovate if no effective and accurate technology is currently available.
While I am directing my remarks towards my noble friend, I will also answer a question she raised earlier on general monitoring. General monitoring is not a legally defined concept in UK law; it is a term in European Union law that refers to the generalised monitoring of user activity online, although its parameters are not clearly defined. The use of automated technologies is already fundamental to how many companies protect their users from the most abhorrent harms, including child sexual abuse. It is therefore important that we empower Ofcom to require the use of such technology where it is necessary and proportionate and ensure that the use of these tools is transparent and properly regulated, with clear and appropriate safeguards in place for users’ rights. The UK’s existing intermediary liability regime remains in place.
Amendment 255 from my noble friend Lord Moylan seeks to prevent Ofcom imposing any requirement in a notice that would weaken or remove end-to-end encryption. He is right that end-to-end encryption should not be weakened or removed. The powers in the Bill will not do that. These powers are underpinned by proportionality and technical feasibility; if it is not proportionate or technically feasible for companies to identify child sexual exploitation and abuse content on their platform while upholding users’ right to privacy, Ofcom cannot require it.
I agree with my noble friend and the noble Baroness, Lady Fox, that encryption is a very important and popular feature today. However, with technology evolving at a rapid rate, we cannot accept amendments that would risk this legislation quickly becoming out of date. Naming encryption in the Bill would risk that happening. We firmly believe that the best approach is to focus on strong safeguards for upholding users’ rights and ensuring that measures are proportionate to the specific situation, rather than on general features such as encryption.
The Bill already requires Ofcom to consider the risk that technology could result in a breach of any statutory provision or rule of law concerning privacy and whether any alternative measures would significantly reduce the amount of illegal content on a service. As I have said in previous debates, Ofcom is also bound by the Human Rights Act not to act inconsistently with users’ rights.
My Lords, in speaking to my Amendment 186A, I hope that noble Lords will forgive me for not speaking in detail to the many other amendments in this group correctly branded “miscellaneous” by those who compile our lists for us. Many of them are minor and technical, especially the government amendments. However, that is not true of all of them: Amendment 253 in the name of the noble Lord, Lord Clement-Jones, is a substantial amendment relating to regulatory co-operation, while Amendment 275A, in the name of the noble Baroness, Lady Finlay of Llandaff, is also of some interest, relating to the reports that Ofcom is being asked to produce on technological developments.
Nor is Amendment 191A lacking in importance and substance, although—I hope I will be forgiven for saying this, not in a snarky sort of way—for those of us who are worried about the enormous powers being given to Ofcom as a result of the Bill, the idea that it should be required by statute to give guidance to coroners, who are part of the courts system, seems to me strange and worth examining more closely. There might be a more seemly way of achieving the effect that the noble Baroness, Lady Kidron, understandably wants to achieve.
I turn to my own Amendment 186A, which, I hope, ought to be relatively straightforward. It concerns the terms of service of a contract with a category 1 service provider, and it is intended to improve the rights that consumers or users of that service have. It is the case that the Government want users of those services to have the ability to enforce their rights under contract against the service providers, as set out in Clause 65, and this is entirely welcome. However, it is well known that bringing claims in contract can be an expensive and onerous business, as I have pointed out in the past, particularly when the service is provided on the one-sided terms of the service provider—often, of course, drafted under the legal system of a foreign jurisdiction.
My Lords, I will make some arguments in favour of Amendment 191A, in the name of the noble Baroness, Lady Kidron, and inject some notes of caution around Amendment 186A.
On Amendment 191A, it has been my experience that when people frequently investigate something that has happened on online services, they do it well, and well-formed requests are critical to making this work effectively. This was the case with law enforcement: when an individual police officer is investigating something online for the first time, they often ask the wrong questions. They do not understand what they can get and what they cannot get. It is like everything in life: the more you do it, the better you get at it.
Fortunately, in a sense, most coroners will only very occasionally have to deal with these awful circumstances where they need data related to the death of a child. At that point, they are going to be very dependent on Ofcom—which will be dealing with the companies day in and day out across a range of issues—for its expertise. Therefore, it makes absolute sense that Ofcom’s expertise should be distributed widely and that coroners—at the point where they need to access this information—should be able to rely on that. So Amendment 191A is very well intended and, from a practical point of view, very necessary if we are going to make this new system work as I know the noble Baroness, Lady Kidron, and I would like to see it work.
On Amendment 186A around consumer law, I can see the attraction of this, as well as some of the read-across from the United States. A lot of the enforcement against online platforms in the US takes place through the Federal Trade Commission precisely in this area of consumer law and looking at unfair and deceptive practices. I can see the attraction of seeking to align with European Union law, as the noble Lord, Lord Moylan, argued we should be doing with respect to consumer law. However, I think this would be much better dealt with in the context of the digital markets Bill and it would be a mistake to squeeze it in here. My reasons for this are about both process and substance.
In terms of process, we have not done the impact assessment on this. It is quite a major change, for two reasons. First, it could potentially have a huge impact in terms of legal costs and the way businesses will have to deal with that—although I know nobody is going to get too upset if the impact assessment says there will be a significant increase in legal costs for category 1 companies. However, we should at least flesh these things out when we are making regulations and have them in an impact assessment before going ahead and doing something that would have a material impact.
Secondly in process terms, there are some really interesting questions about the way this might affect the market. The consumer law we have does exclude services that are offered for free, because so much of consumer law is about saying, “If the goods are not delivered correctly, you get your money back”. With free services, we are clearly dealing with a different model, so the notion that we have a law that is geared towards making sure you either get the goods or you get the money may not be the best fit. To try to shoehorn in these free-at-the-point-of-use services may not be the best way to do it, even from a markets and consumer point of view. Taking our time to think about how to get this right would make sense.
More fundamentally, in terms of the substance, we need to recognise that, as a result of the Online Safety Bill, Ofcom will be requiring regulated services to rewrite their terms of service in quite a lot of detail. We see this throughout the Bill. We are going to have to do all sorts of things—we will debate other amendments in this area today—to make sure that their terms of service are conformant with what we want from them in this Bill. They are going to have to redo their complaints and redress mechanisms. All of this is going to have to change and Ofcom is going to be the regulator that tells them how to do it; that is what we are asking Ofcom to tell them to do.
My fundamental concern here, if we introduce another element, is that there is a whole different structure under consumer law where you might go to local trading standards or the CMA, or you might launch a private action. In many cases, this may overlap. The overlap is where consumer law states that goods must be provided with reasonable care and skill and in a reasonable time. That sounds great, but it is also what the Online Safety Bill is going to be doing. We do not want consumer law saying, “You need to write your terms of service this way and handle complaints this way”, and then Ofcom coming along and saying, “No, you must write your terms of service that way and handle complaints that way”. We will end up in a mess. So I just think that, from a practical point of view, we should be very focused in this Bill on getting all of this right from an Online Safety Bill point of view, and very cautious about introducing another element.
Perhaps one of the attractions of the consumer law point for those who support the amendment is that it says, “Your terms must be fair”. It is the US model; you cannot have unfair terms. Again, I can imagine a scenario in which somebody goes to court and tries to get the terms struck down because they are unfair but the platform says, “They’re the terms Ofcom told me to write. Sort this out, please, because Ofcom is saying I need to do this but the courts are now saying the thing I did was unfair because somebody feels that they were badly treated”.
Does the noble Lord accept that that is already a possibility? You can bring an action in contract law against them on the grounds that it is an unfair contract. This could happen already. It is as if the noble Lord is not aware that the possibility of individual action for breach of contract is already built into Clause 65. This measure simply supplements it.
I am certainly aware that it is there but, again, the noble Lord has just made the point himself: this supplements it. The intent of the amendment is to give consumers more rights under this additional piece of legislation; otherwise, why bother with the amendment at all? The noble Lord may be arguing against himself in saying that this is unnecessary and, at the same time, that we need to make the change. If we make the change, it is, in a sense, a material change to open the door to more claims being made under consumer law that terms are unfair. As I say, we may want this outcome to happen eventually, but I find it potentially conflicting to do it precisely at a time when we are getting Ofcom to intervene much more closely in setting those terms. I am simply arguing, “Let’s let that regime settle down”.
The net result and rational outcome—again, I am speaking to my noble friend’s Amendment 253 here—may be that other regulators end up deferring to Ofcom. If Ofcom is the primary regulator and we have told it, under the terms of the Online Safety Bill, “You must require platforms to operate in this way, handle complaints in this way and have terms that do these things, such as excluding particular forms of language and in effect outlawing them on platforms”, the other regulators will eventually end up deferring to it. All I am arguing is that, at this stage, it is premature to try to introduce a second, parallel route for people to seek changes to terms or different forms of redress, however tempting that may be. So I am suggesting a note of caution. It is not that we are starting from Ground Zero—people have routes to go forward today—but I worry about introducing something that I think people will see as material at this late stage, having not looked at the full impact of it and potentially running in conflict with everything else that we are trying to do in this legislation.
My Lords, this has indeed been a wide-ranging and miscellaneous debate. I hope that since we are considering the Bill on Report noble Lords will forgive me if I do not endeavour to summarise all the different speeches and confine myself to one or two points.
The first is to thank the noble Baroness, Lady Kidron, for her support for my amendment but also to say that having heard her argument in favour of her Amendment 191A, I think the difference between us is entirely semantic. Had she worded it so as to say that Ofcom should be under a duty to offer advice to the Chief Coroner, as opposed to guidance to coroners, I would have been very much happier with it. Guidance issued under statute has to carry very considerable weight and, as my noble friend the Minister said, there is a real danger in that case of an arm of the Executive, if you like, or a creature of Parliament—however one wants to regard Ofcom—interfering in the independence of the judiciary. Had she said “advice to the Chief Coroner and whoever is the appropriate officer in Scotland”, that would have been something I could have given wholehearted support to. I hope she will forgive me for raising that quibble at the outset, but I think it is a quibble rather than a substantial disagreement.
On my own amendment, I simply say that I am grateful to my noble friend for the brevity and economy with which he disposed of it. He was of course assisted in that by the remarks and arguments made by many other noble Lords in the House as they expressed their support for it in principle.
I think there is a degree of confusion about what the Bill is doing. There seemed to be a sense that somehow the amendment was giving individuals the right to bring actions in the courts against providers, but of course that already happens because that right exists and is enshrined in Clause 65. All the amendment would do is give some balance so that consumers actually had some protections in what is normally, in essence, an unequal contest: trying to ensure that a large company enforces the terms and contracts that it has written.
In particular, my amendment would give, as I think noble Lords know, the right to demand repeat performance—that is, in essence, the right to put things right, not monetary compensation—and it would frustrate any attempts by providers, in drafting their own terms and conditions, to limit their own liability. That is of course what they seek to do but the Consumer Rights Act frustrates them in their ability to do so.
We will say no more about that for now. With that, I beg leave to withdraw my amendment.
My Lords, I associate myself with my noble friend Lady Fraser of Craigmaddie’s incredibly well-made points. I learned a long time ago that, when people speak very softly and say they have a very small point to make, they are often about to deliver a zinger. She really did; it was hugely powerful. I will say no more than that I wholeheartedly agree with her; thank you for helping us to understand the issue properly.
I will speak in more detail about access to data for researchers and in support of my noble friend Lord Bethell’s amendments. I too am extremely grateful to the Minister for bringing forward all the government amendments; the direction of travel is encouraging. I am particularly pleased to see the movement from “may” to “must”, but I am worried that it is Ofcom’s rather than the regulated services’ “may” that moves to “must”. There is no backstop for recalcitrant regulated services that refuse to abide by Ofcom’s guidance. As the noble Baroness, Lady Kidron, said, in other areas of the Bill we have quite reasonably resorted to launching a review, requiring Ofcom to publish its results, requiring the Secretary of State to review the recommendations and then giving the Secretary of State backstop powers, if necessary, to implement regulations that would then require regulated companies to change.
I have a simple question for the Minister: why are we not following the same recipe here? Why does this differ from the other issues, on which the House agrees that there is more work to be done? Why are we not putting backstop powers into the Bill for this specific issue, when it is clear to all of us that it is highly likely that there will be said recalcitrant regulated firms that are not willing to grant access to their data for researchers?
Before my noble friend the Minister leaps to the hint he gave in his opening remarks—that this should all be picked up in the Data Protection and Digital Information Bill—unlike the group we have just discussed, this issue was discussed at Second Reading and given a really detailed airing in Committee. This is not new news, in the same way that other issues where we have adopted the same recipe that includes a backstop are being dealt with in the Bill. I urge my noble friend the Minister to follow the good progress so far and to complete the package, as we have in other areas.
My Lords, it is valuable to be able to speak immediately after my noble friend Lady Harding of Winscombe, because it gives me an opportunity to address some remarks she made last Wednesday when we were considering the Bill on Report. She suggested that there was a fundamental disagreement between us about our view of how serious online safety is—the suggestion being that somehow I did not think it was terribly important. I take this opportunity to rebut that and to add to it by saying that other things are also important. One of those things is privacy. We have not discussed privacy in relation to the Bill quite as much as we have freedom of expression, but it is tremendously important too.
Government Amendment 247A represents the most astonishing level of intrusion. In fact, I find it very hard to see how the Government think they can get away with saying that it is compatible with the provisions of the European Convention on Human Rights, which we incorporated into law some 20 years ago, thus creating a whole law of privacy that is now vindicated in the courts. It is not enough just to go around saying that it is “proportionate and necessary” as a mantra; it has to be true.
This provision says that an agency has the right to go into a private business with no warrant, and with no let or hindrance, and is able to look at its processes, data and equipment at will. I know of no other business that can be subjected to that without a warrant or some legal process in advance pertinent to that instance, that case or that business.
My noble friend Lord Bethell said that the internet has been abused by people who carry out evil things; he mentioned terrorism, for example, and he could have mentioned others. However, take mobile telephones and Royal Mail—these are also abused by people conducting terrorism, but we do not allow those communications to be intruded into without some sort of warrant or process. It does not seem to me that the fact that the systems can be abused is sufficient to justify what is being proposed.
My noble friend the Minister says that this can happen only offline. Frankly, I did not understand what he meant by that. In fact, I was going to say that I disagreed with him, but I am moving to the point of saying that I think it is almost meaningless to say that it is going to happen offline. He might be able to explain that. He also said that Ofcom will not see individual traffic. However, neither the point about being offline nor the point about not seeing individual traffic is on the face of the Bill.
When we ask ourselves what the purpose of this astonishing power is—this was referred to obliquely to some extent by the noble Baroness, Lady Fox of Buckley—we can find it in Clause 91(1), into which proposed new subsection (2A) is being added or squeezed, subordinate to it. Clause 91(1) talks about
“any information that they”—
that is, Ofcom—
“require for the purpose of exercising, or deciding whether to exercise, any of their online safety functions”.
The power could be used entirely as a fishing expedition. It could be entirely for the purpose of educating Ofcom as to what it should be doing. There is nothing here to say that it can have these powers of intrusion only if it suspects that there is criminality, a breach of the codes of conduct or any other offence. It is a fishing expedition, entirely for the purpose of
“exercising, or deciding whether to exercise”.
Those are the intrusions imposed upon companies. In some ways, I am less concerned about the companies than I am about what I am going to come to next: the intrusion on the privacy of individuals and users. If we sat back and listened to ourselves and what we are saying, could we explain to ordinary people—we are going to come to this when we discuss end-to-end encryption—what exactly can happen?
Two very significant breaches of the protections in place for privacy on the internet arise from what is proposed. First, if you allow someone into a system and into equipment, especially from outside, you increase the risk and the possibility that a further, probably more hostile party that is sufficiently well-equipped with resources—we know state actors with evil intent which are so equipped—can get in through that or similar holes. The privacy of the system itself would be structurally weakened as a result of doing this. Secondly, if Ofcom is able to see what is going on, the system becomes leaky in the direction of Ofcom. It can come into possession of information, some of which could be of an individual character. My noble friend says that it will not be allowed to release any data and that all sorts of protections are in place. We know that, and I fully accept the honesty and integrity of Ofcom as an institution and of its staff. However, we also know that things get leaked and escape. As a result of this provision, very large holes are being built into the protections of privacy that exist, yet there has been no reference at all to privacy in the remarks made so far by my noble friend.
I finish by saying that we are racing ahead and not thinking. Good Lord, my modest amendment in the last group to bring a well-established piece of legislation—the Consumer Rights Act—to bear upon this Bill was challenged on the grounds that there had not been an impact assessment. Where is the impact assessment for this? Where is even the smell test for this in relation to explaining it to the public? If my noble friend is able to expatiate at the end on the implications for privacy and attempt to give us some assurance, that would be some consolation. I doubt that he is going to give way and do the right thing and withdraw this amendment.
My Lords, the debate so far has been—in the words of the noble Baroness, Lady Fox—a Committee debate. That is partly because this set of amendments from the Government has come quite late. If they had been tabled in Committee, I think we would have had a more expansive debate on this issue and could have knocked it about a bit and come back to it on Report. The timing is regrettable in all of this.
That said, the Government have tabled some extremely important amendments, particularly Amendments 196 and 198, which deal with things such as algorithms and functionalities. I very much welcome those important amendments, as I know the noble Baroness, Lady Kidron, did.
I also very much support Amendments 270 and 272 in the name of the noble Baroness, Lady Fraser. I hope the Minister, having been pre-primed, has all the answers to them. It is astonishing that, after all these years, we are so unattuned to the issues of the devolved Administrations and that we are still not in the mindset on things such as research. We are not sufficiently granular, as has been explained—let alone all the other questions that the noble Lord, Lord Stevenson, asked. I hope the Minister can unpack some of that as well.
I want to express some gratitude, too, because the Minister and his officials took the trouble to give us a briefing about remote access issues, alongside Ofcom. Ofcom also sent through its note on algorithmic assessment powers, so an effort has been made to explain some of these powers. Indeed, I can see the practical importance, as explained to us. It is partly the lateness, however, that sets off what my noble friend Lord Allan called “trigger words” and concerns about the remote access provisions. Indeed, I think we have a living and breathing demonstration of the impact of triggers on the noble Lord, Lord Moylan, because these are indeed issues that concern those outside the House to quite a large degree.
My Lords, continuing the rather radical approach of debating an amendment that has already been debated in Committee and has not just been introduced, and picking up on the theme of our debate immediately before we adjourned, I move an amendment that seeks to address the question of the Government’s activities in interacting with providers when they seek to influence providers on what is shown on their sites.
It might be a matter of interest that according to the Daily Telegraph, which I implicitly trust, only on Tuesday of last week, a judge in Louisiana in the United States issued an injunction forbidding a lengthy list of White House officials from making contact with social media companies to report misinformation. I say this not because I expect the jurisprudence of the state of Louisiana to have any great influence in your Lordships’ House but simply to show how sensitive and important this issue is. The judge described what he had heard and seen as one of the greatest assaults on free speech in the history of the United States.
We are not necessarily quite in that territory, and nor does my amendment do anything so dramatic as to prevent the Government communicating with providers with a view to influencing their content, but Amendment 225 requires the Secretary of State to produce a report within six months of the passing of the Act, and every six months thereafter, in which he sets out
“any relevant representations His Majesty’s Government have made to providers”
that are
“intended to persuade or encourage a provider”
to do one of three things. One is to
“modify the terms of service of a regulated service in an effort to address misinformation or disinformation”;
one is to
“restrict or remove a particular user’s access to accounts used by them”;
and the third is to
“take down, reduce the visibility of, or restrict access to content that is present or may be encountered on a regulated service”.
None of these things would be prohibited or prevented by this amendment, but it would be required that His Majesty’s Government produce a report saying what they have done every six months.
Very importantly there is an exception, in that there would be no obligation on the Secretary of State to disclose publicly any information that affected national security, but he would be required in that case to make a report to the Intelligence and Security Committee here in Parliament. As I said, this is a very sensitive subject, and remarks made by the noble Baroness, Lady Fox of Buckley, in the previous debate referred in particular to this subject in connection with the pandemic. While that is in the memory, other topics may easily come up and need to be addressed, where the Government feel obliged to move and take action.
We know nothing about those contacts, because they are not instructions or actions taken under law. They are simply nudges, winks and phone conversations with providers that have an effect and, very often, the providers will act on them. Requiring the Government to make a report and say what they have done seems a modest, proportionate and appropriate means to bring transparency to this exercise, so that we all know what is going on.
I have addressed the points made by the noble Baroness and my noble friend already. She asks the same question again and I can give her the same answer. We are operating openly and transparently here, and the Bill sets out further provisions for transparency and accountability.
My Lords, I see what my noble friend did there, and it was very cunning. He gave us a very worthwhile account of the activities of the Counter Disinformation Unit, a body I had not mentioned at all, as if the Counter Disinformation Unit was the sole locus of this sort of activity. I had not restricted it to that. We know, in fact, that other bodies within government have been involved in undertaking this sort of activity, and on those he has given us no answer at all, because he preferred to answer about one particular unit. He referred also to its standardised transparency processes. I can hardly believe that I am reading out words such as those. The standardised transparency process allows us all to know that encounters take place but still refuses to let us know what actually happens in any particular encounter, even though there is a great public interest in doing so. However, I will not press it any further.
My noble friend, who is genuinely a friend, is in danger of putting himself, at the behest of civil servants and his ministerial colleagues, in some danger. We know what happens in these cases. The Minister stands at the Dispatch Box and says “This has never happened; it never normally happens; it will not happen. Individuals are never spoken of, and actions of this character are never taken”. Then of course, a few weeks or months later, out pour the leaked emails showing that all these things have been happening all the time. The Minister then has to resign in disgrace and it is all very sad. His friends, like myself, rally round and buy him a drink, before we never see him again.
Anyway, I think my noble friend must be very careful that he does not put himself in that position. I think he has come close to doing so this evening, through the assurances he has given your Lordships’ House. Although I do not accept those assurances, I will none the less withdraw the amendment, with the leave of the House.
(1 year, 3 months ago)
Lords Chamber
My Lords, I will speak to the amendments in this group in my name: Amendments 139, 140, 144 and 145. I thank the noble Lords, Lord Stevenson and Lord Clement-Jones, and the noble Viscount, Lord Colville, for signing those amendments and for their continued support on this group. I am also grateful to my noble friend the Minister and his team for engaging with me on the issue of Secretary of State powers. He has devoted a lot of time and energy to this, which is reflected in the wide-ranging group of amendments tabled by him.
Before I go any further, it is worth emphasising that the underlying concern here is making sure that we have confidence, through this new regulation regime, that the Bill strikes the right balance of power between government, Parliament, the regulator and big tech firms. The committee that I chair—the Communications and Digital Select Committee of your Lordships’ House—has focused most on that in our consideration of the Bill. I should say also that the amendments I have brought forward in my name very much have the support of the committee.
These amendments relate to Clause 39, which is where the main issue lies in the context of Secretary of State powers, and we have three broad concerns. First, as it stood, the Bill handed the Secretary of State unprecedented powers to direct the regulator on pretty much anything. Secondly, these powers allowed the Government to conduct an infinite form of ping-pong with the regulator, enabling the Government to prevail in a dispute. Thirdly, this ping-pong could take place in private with no possibility of parliamentary oversight or being able to intervene, as would be appropriate in the event of a breakdown in the relationship between executive and regulator.
This matters because the Online Safety Bill creates a novel form for regulating the internet and what we can or cannot see online, in particular political speech, and it applies to the future. It is one thing for the current Government, who I support, to say that they would never use the powers in this way. That is great but, as we know, current Governments cannot speak for whoever is in power in the generations to come, so it is important that we get this right.
As my noble friend said, he has brought forward amendments to Clause 39 that help to address this. I support him in and commend him for that. The original laundry list of powers to direct Ofcom has been shortened and now follows the precedent set out in the Communications Act 2003. The government amendments also say that the Secretary of State must now publish their directions to Ofcom, which will improve transparency, and once the code is agreed Ofcom will publish changes so that Parliament can see what changes have been made and why. These are all very welcome and, as I say, they go a long way to addressing some of our concerns, but two critical issues remain.
First, the Government retain an opt-out, which means that they do not have to publish their directions if the Secretary of State believes that doing so would risk
“national security or public safety”,
or international relations. However, those points are now the precise grounds on which the Secretary of State may issue a direction and, if history is any guide, there is a real risk that we will never hear about the directions because the Government have decided that they are a security issue.
My Amendments 139 and 140 would require the Secretary of State to at least notify Parliament of the fact that a direction has been issued and what broad topic it relates to. That would not require any details to be published, so it does not compromise security, but it does give assurance that infinite, secretive ping-pong is not happening behind the scenes. My noble friend spoke so quickly at the beginning that I was not quite sure whether he signalled anything, but I hope that he may be able to respond enthusiastically to Amendments 139 and 140.
Secondly, the Government still have powers for infinite ping-pong. I appreciate that the Government have reservations about capping the number of exchanges between the Secretary of State and Ofcom, but they must also recognise the concern that they appear to be preparing the ground for any future Government to reject infinitely the regulator’s proposals and therefore prevail in a dispute about a politically contentious topic. My Amendments 144 and 145 would clarify that the Government will have a legally binding expectation that they will use no more than the bare minimum number of directions to achieve the intent set out in their first direction.
The Government might think that adding this to the Bill is superfluous, but it is necessary in order to give Parliament and the public confidence about the balance of power in this regime. If Parliament felt that the Secretary of State was acting inappropriately, we would have sufficient grounds to intervene. As I said, the Government acknowledged in our discussions the policy substance of these concerns, and as we heard from my noble friend the Minister in introducing this group, there is an understanding on this. For his part, there is perhaps a belief that what they have done goes far enough. I urge him to reconsider Amendments 144 and 145, and I hope that, when he responds to the debate on this group, he can say something about not only Amendments 139 and 140 but the other two amendments that will give me some grounds for comfort.
My Lords, I realise that I am something of a fish out of water in this House, as I was in Committee, on the Bill, which is fundamentally flawed in a number of respects, including its approach to governance, which we are discussing today. Having said that, I am generally sympathetic to the amendments proposed by my noble friend Lady Stowell of Beeston. If we are to have a flawed approach, her amendments would improve it somewhat.
However, my approach is rather different and is based on the fairly simple but important principle that we live in a free democracy. If we are to introduce a new legislative measure such as this Bill, which has far-reaching powers of censorship taking us back 70 or 80 years in terms of the freedom of expression we have been able to develop since the 1950s and 1960s—to the days of Lady Chatterley’s Lover and the Lord Chamberlain, in equivalent terms, as far as the internet and the online world are concerned—then decisions of such a far-reaching character affecting our lives should be taken by somebody who is democratically accountable.
My approach is utterly different from that which my noble friend on the Front Bench has proposed. He has proposed amendments which limit yet further the Secretary of State’s power to give directions to Ofcom, but the Secretary of State is the only party in that relationship who has a democratic accountability. We are transferring huge powers to a completely unaccountable regulator, and today my noble friend proposes transferring, in effect, even more powers to that unaccountable regulator.
To go back to a point that was discussed in Committee and earlier on Report, if Ofcom takes certain decisions which make it impossible for Wikipedia to operate its current model, such that it has to close down at least its minority language websites—my noble friend said that the Government have no say over that and no idea what Ofcom will do—to whom do members of the public protest? To whom do they offer their objections? There is no point writing to the Secretary of State because, as my noble friend told us, they will not have had any say in the matter and we in this House will have forsworn the opportunity, which I modestly proposed, to take those powers here. There is no point writing to their MP, because all their MP can do is badger the Secretary of State. It is a completely unaccountable structure that is completely indefensible in a modern democratic society. So I object to the amendments proposed by my noble friend, particularly Amendments 136 and 137.
My Lords, not for the first time I find myself in quite a different place from my noble friend Lord Moylan. Before I go through some detailed comments on the amendments, I want to reflect that at the root of our disagreement is a fundamental view about how serious online safety is. The logical corollary of my noble friend’s argument is that all decisions should be taken by Secretaries of State and scrutinised in Parliament. We do not do that in other technical areas of health and safety in the physical world and we should not do that in the digital world, which is why I take such a different view—
(1 year, 3 months ago)
Lords Chamber
My Lords, I am rather disappointed that, while this is a large group on freedom of expression, it is dominated by amendments by myself and the noble Lord, Lord Moylan. I welcome the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Stevenson of Balmacara, dipping their toes in the free-expression water here and I am glad that the Minister has added his name to their amendment, although it is a shame that he did not add his name to one of mine.
Earlier today we heard a lot of congratulations to the Government for listening. I have to say, it depends who you are, because the Government have not listened to all of us. It is notable that, of the hundreds of new government concessions that have taken the form of amendments on Report, none relates to free speech. Before I go through my amendments, I want to note that, when the noble Lord, Lord Moylan, and I raise concerns about free speech, it can be that we get treated as being slightly eccentric. There has been a generally supportive and generous mood from the regulars in this House. I understand that, but I worry that free speech is being seen as peripheral.
This country, our country, that we legislate for and in, has a long history of boasting that it is the home of liberty and adopts the liberal approach that being free is the default position: that free speech and the plurality and diversity of views it engenders are the cornerstone of democracy in a free society and that any deviation from that approach must require extraordinary and special justification. A comprehensive piece of law, such as the one we are dealing with, that challenges many of those norms, deserves thorough scrutiny through the prism of free speech.
When I approached this Bill, which I had been following long before I arrived in this House, I assumed that there would be packed Benches—as there are on the Illegal Migration Bill—and that everybody, including all these Law Lords, would be in here quoting the European Court of Human Rights on Article 8 and Article 10. I assumed there would be complaints about Executive power grabs and so on. But it has been a bit sparse.
That is okay; I can live with that, even if it is a bit dispiriting. But I am concerned when the Government cite that the mood of the Committee has been reflected in their amendments, because it has not been a very large Committee. Many of the amendments that I, the noble Lord, Lord Moylan, and others tabled about free expression represent the concerns of a wide range of policy analysts, civil rights groups, academics, lawyers, free speech campaigners and industry representatives. They have been put forward in good faith—I continue to do that—to suggest ways of mitigating some of the grave threats to free speech in this Bill, with constructive ideas about how to tackle flaws and raising some of the problems of unintended consequences. I have, at times, felt that those concerns were batted away with a certain indifference. Despite the Minister being very affable and charming, none the less it can be a bit disappointing.
Anyway, I am here to bat again. I hope that the Government now will listen very closely and consider how to avoid the UK ending up with the most restrictive internet speech laws of any western democracy at the end of this. I have a lot of different amendments in my name in this group. I wholeheartedly support the amendments in the name of the noble Lord, Lord Moylan, requiring Ofcom to assess the impact of its codes on free speech, but I will not speak to them.
I will talk about my amendments, starting with Amendments 77, 78, 79, 80 and 81. These require platforms to have particular regard to freedom of expression, not just when implementing safety measures and policies but when writing their terms of service. This is to ensure that freedom of expression is not reduced to an abstract “have regard to” secondary notion but is visible in drafting terms of services. This would mean that users know their rights in clear and concrete terms. For example, a platform should be expected to justify how a particular term of service, on something such as religious hatred, will be balanced with consideration of freedom of expression and conscience, in order to allow discussions over different beliefs to take place. Users need to be able to point to specific provisions in the terms of service setting out their free speech protections.
This is all about parity between free speech and safety. Although the Government—and I welcome this—have attempted some balance, via Clause 18, to mitigate the damage to individual rights of free expression from the Bill, it is a rather weak, poor cousin. We need to recognise that, if companies are compelled to prevent and minimise so-called harmful content via operational safety duties, there should be parity for free expression: these amendments say that companies should be compelled to do the same for freedom of expression, with a clear and positive duty, rather than Clause 64, which is framed rather negatively.
Amendment 188 takes on the issue of terms of service from a different direction, attempting to ensure that duties with regard to safety must not be allowed to restrict lawful expression or that protected by Article 10 of the European Convention on Human Rights. That states that interference in free speech rights is not lawful unless it is a last resort. I note, in case anyone is reading the amendment carefully, and for Hansard, that the amendment cites Article 8—a rather Freudian slip on my part that was not corrected by the Table Office. That is probably because privacy rights are also threatened by the Bill, but I meant Article 10 of course.
Amendment 188 addresses a genuine dilemma in terms of Ofcom enforcing safety duties via terms and conditions. These will transform private agreements between companies and users into statutory duties under Clause 65. This could mean that big tech companies would be exercising public law functions by state-backed enforcement of the suppression of lawful speech. One worry is that platforms’ terms of service are not neutral; they can change due to external political or commercial pressures. We have all been following with great interest what is happening at Twitter. They are driven by values which can be at odds with UK laws. So I hope the Minister will answer the query that this amendment poses: how is the UK able to uphold its Article 10 obligations if state regulators are legally instructed to enforce terms of service attitudes to free speech, even when they censor far more than UK domestic law requires?
Amendment 162 has a different focus and removes offences under Section 5 of the Public Order Act from the priority offences to be regulated as priority illegal content, as set out in Schedule 7. This amendment is prompted by a concern that the legislation enlists social media companies to act as a private online police force and to adjudicate on the legality of online content. This is especially fraught in terms of the legal limits on speech, where illegality is often contested and contentious—offline as well as online.
The inclusion of Section 5 would place a duty on service providers to take measures to prevent individuals ever encountering content that includes
“threatening or abusive words or behaviour, or disorderly behaviour”
that is likely to cause “harassment, alarm or distress”. It would also require service providers to minimise the length of time such content is present on the service.
I am not sure whether noble Lords have been following the dispute that broke out over the weekend. There is a film doing the rounds on social media of a trans speaker, Sarah Jane Baker, at the Trans Pride event screaming pretty hysterically, “If you see a TERF, punch them in the effing face”—and I am being polite. You would think that that misogynistic threat would be the crime people might be concerned about, yet some apologists for Trans Pride claim that those women—TERFs such as myself—who are outraged by the speech are the ones who are stirring up hate.
Now, that is a bit of a mess, but asking service providers, or indeed algorithms, to untangle such disputes can surely lead only to the over-removal of online expression, or even more of a muddle. As the rule of law charity Justice points out, this could also catch content that depicts conflict or atrocities, such as those taking place in the Russia-Ukraine war. Justice asks whether the inclusion of Section 5 of the POA could lead to the removal of posts by individuals sharing stories of their own abuse or mistreatment on internet support forums.
Additionally, under Schedule 7 to the Bill, versions of Section 5 could also be regulated as priority illegal conduct, meaning that providers would have to remove or restrict content that, for instance, encourages what is called disorderly behaviour that is likely to cause alarm. Various organisations are concerned that this could mean that content that portrayed protest activity, that might be considered disorderly by some, was removed unless you condemned it, or even that content which encouraged people to attend protests would be in scope.
I am not a fan of Section 5 of the Public Order Act, which criminalises stirring up hatred, at the best of times, but at least those offences have been and are subject to the full rigour of the criminal justice system and case law. Of course, the courts, the CPS and the police are also bound, for example by Article 10, to protect free speech. But that is very different to compelling social media companies, their staff or automated algorithms to make such complex assessments of the Section 5 threshold of illegality. Through no fault of their own, those companies are just not qualified to make such determinations, and it is obvious that that could mean that legitimate speech will end up being restricted. Dangerously, it also makes a significant departure from the UK’s rule of law in deciding what is legal or illegal speech. It has the potential to limit UK users’ ability to engage in important aspects of public life, and prevent victims of abuse from sharing their stories, as I have described.
I turn finally to the last amendment, Amendment 275—I will keep this short, for time’s sake. I will not go into detail, but I hope that the Minister will take a look at it, see that there is a loophole, and discuss it with the department. In skeleton form, the Free Speech Union has discovered that the British Board of Film Classification runs a mobile classification network, an agreement with mobile network providers under which it advises them on what content should be filtered because it is considered suitable for adults only. This arrangement is private, not governed by statute, and as such means that even the weak free speech safeguards in this Bill can be sidestepped. This affects not only under-18s but anyone with factory settings on their phone. It led to a particularly bizarre outcome last year, when readers of the online magazine The Conservative Woman reported that the website was inaccessible. This small online magazine was apparently blacklisted by the BBFC because of comments below the line on its articles. The potential for such arbitrary censorship is a real concern, and the magazine cannot even appeal to the BBFC, so I ask the Minister to take this amendment back to the DCMS, which helped set up this mobile classification network, and find out what is going on.
That peculiar tale illustrates my concerns about what happens when free speech is not front and centre, even when you are concerned about safety and harm. I worry that when free speech is casually disregarded, censorship and bans can become the default, and a thoughtless option. That is why I urge the Minister before Third Reading to at least make sure that some of the issues and amendments in this group are responded to positively.
My Lords, my noble friend on the Front Bench said at various points when we were in Committee that the Bill struck an appropriate balance between protecting the rights of children and the rights of those wishing to exercise their freedom of expression. I have always found it very difficult indeed to discern that point of balance in the Bill as originally drafted, but I will say that if there were such a point, it has been swamped by the hundreds of amendments tabled to the Bill by my noble friend since Committee which push the Bill entirely in the opposite direction.
Among those amendments, I cannot find—it may be my fault, because I am just looking by myself; I have no help to find these things—a single one which seeks to redress the balance back in favour of freedom of expression. My Amendments 123, 128, 130, 141, 148 and 244 seek to do that to some extent, and I am grateful to the noble Baroness, Lady Fox of Buckley, for the support she has expressed for them.
I understand the point the noble Lord is making but, if he were thrown out, sacked or treated in some other way that was incompatible with his rights to freedom of expression under Article 10 of the European convention, he would have cause for complaint and, possibly, cause for legal redress.
That point is well made. In support of that, if the public space treated me in a discriminatory way, I would expect to have redress, but I do not think I have a right in every public space to say everything I like in the classic Article 10 sense. My right vis-à-vis the state is much broader than my right vis-à-vis any public space that I am operating in, where norms apply as well as my basic legal rights. Again, to take the pub example, if I went in and made a racist speech, I may well be thrown out of the pub even though it is sub-criminal and the police are never called; they do not need to be, as the space itself organises it.
I am making the point that terms of service are about managing these privately managed public services, and it would be a mistake to equate them entirely with our right to speak or the point at which the state can step in and censor us. I understand the point about state interference but it cuts both ways: both the state interfering in excessively censoring what we can say but also the state potentially interfering in the management of what is, after all, a private space. To refer back to the US first amendment tradition, a lot of that was about freedom of religion and precisely about enabling heterodoxy. The US did not want an orthodoxy in which one set of rules applied everywhere to everybody. Rather, it wanted people to have the right to dissent, including in ways that were exclusive. You could create your own religious sect and you could not be told not to have those beliefs.
Rolling that power over to the online world, online services, as long as they are non-discriminatory, can have quite different characters. Some will be very restrictive of speech like a restrictive religious sect; some will be very open and catholic, with a small “c”, in the sense of permitting a broad range of speech. I worry about some of the amendments in case there is a suggestion that Ofcom would start to tell a heterodox community of online services that there is an orthodox way to run their terms of service; I would rather allow this to be a more diverse environment.
Having expressed some concerns, I am though very sympathetic to Amendment 162 on Section 5 of the Public Order Act. I have tried in our debates to bring some real experience to this. There are two major concerns about the inclusion of the Public Order Act in the Bill. One is a lack of understanding of what that means. If you look at the face of the language that has been quoted at us, and go back to that small service that does not have a bunch of lawyers on tap, it reads as though it is stopping any kind of abusive content. Maybe you will google it, as I did earlier, and get a little thing back from the West Yorkshire Police. I googled: “Is it illegal to swear in the street?”. West Yorkshire Police said, “Yes, it is”. So if you are sitting somewhere googling to find out what this Public Order Act thing means, you might end up thinking, “Crikey, for UK users, I have to stop them swearing”. There is a real risk of misinterpretation.
The second risk is that of people deliberately gaming the system; again, I have a real-life example from working in one of the platforms. I had people from United Kingdom law enforcement asking us to remove content that was about demonstrations by far-right groups. They were groups I fundamentally disagree with, but their demonstrations did not appear to be illegal. The grounds cited were that, if you allow this content to go ahead and the demonstration happens, there will be a Public Order Act offence. Once you get that on official notepaper, you have to be quite robust to say, “No, I disagree”, which we did on occasion.
I think there will be other services that receive Public Order Act letters from people who seem official and they will be tempted to take down content that is entirely legal. The critical thing here is that that content will often be political. In other parts of the Bill, we are saying that we should protect political speech, yet we have a loophole here that risks that.
I am sure the Minister will not concede these amendments, but I hope he will concede that it is important that platforms are given guidance so that they do not think that somebody getting upset about a political demonstration is sufficient grounds to remove the content as a Public Order Act offence. If you are a local police officer it is much better to get rid of that EDL demonstration, so you write to the platform and it makes your life easier, but I do not think that would be great from a speech point of view.
Finally, I turn to the point made by the noble Lord, Lord Moylan, on Amendment 188 about the ECHR Article 8 exemption. As I read it, if your terms of service are not consistent with ECHR Article 8—and I do not think they will be for most platforms—you then get an exemption from all the other duties around appeals and enforcing them correctly. It is probably a probing amendment but it is a curious way of framing it; it essentially says that, if you are more restrictive, you get more freedom in terms of the Ofcom relationship. I am just curious about the detail of that amendment.
It is important that we have this debate and understand this relationship between the state, platforms and terms of service. I for one am persuaded that the general framework of the Bill makes sense; there are necessary and proportionate restrictions. I am strongly of the view that platforms should be allowed to be heterodox in their terms of service. Ofcom’s job is very much to make sure that they are done correctly but not to interfere with the content of those terms of service beyond that which is illegal. I am persuaded that we need to be extraordinarily careful about including Public Order Act offences; that particular amendment needs a good hearing.
My Lords, before my noble friend sits down, perhaps I could seek a point of clarification. I think I heard him say, at the beginning of his response to this short debate, that providers will be required to have terms of service which respect users’ rights. May I ask him a very straightforward question: do those rights include the rights conferred by Article 10 of the European Convention on Human Rights? Put another way, is it possible for a provider operating in the United Kingdom to have terms and conditions that abridge the rights conferred by Article 10? If it is possible, what is the Government’s defence of that? If it is not possible, what is the mechanism by which the Bill achieves that?
As I set out, I think my noble friend and the noble Baroness, Lady Fox, are not right to point to the European Convention on Human Rights here. That concerns individuals’ and entities’ rights
“to receive and impart ideas without undue interference”
by public authorities, not private entities. We do not see how a service provider deciding not to allow certain types of content on its platform would engage the Article 10 rights of the user, but I would be very happy to discuss this further with my noble friend and the noble Baroness in case we are talking at cross-purposes.
(1 year, 4 months ago)
Lords ChamberMy Lords, I speak to Amendments 56, 58, 63 and 183 in my name in this group. I have some complex arguments to make, but time is pressing, so I shall attempt to do so as briefly as possible. I am assisted in that by the fact that my noble friend on the Front Bench very kindly explained that the Government are not going to accept my worthless amendments, without actually waiting to hear what it is I might have said on their behalf.
None the less, I turn briefly to Amendment 183. The Bill has been described, I think justly, as a Twitter-shaped Bill: it does not take proper account of other platforms that operate in different ways. I return to the question of Wikipedia, but also platforms such as Reddit and other community-driven platforms. The requirement for a user-verification tool is of course intended to lead to the possibility that ordinary, unverified users—people like you and me—could have the option to see only that content which comes from those people who are verified.
This is broadly a welcome idea, but when we combine it with the fact that there are community-driven sites such as Wikipedia, where the people who contribute are not always verified—sometimes there are very good reasons why they would want to preserve their anonymity—we end up with the possibility of whole articles having sentences left out and so on. That is not going to happen; the fact is that a body such as Wikipedia cannot operate a site in that way, so it is another one of those existential questions that the Government have not properly grappled with and really must address before we come to Third Reading, because this will not work as it is.
As for my other amendments, they are supportive of and consistent with the idea of user verification, and they recognise—as my noble friend said—that user verification is intended to be a substitute for the abandoned “legal but harmful” clause. I welcome the abandonment of that clause and recognise that this provision is more consistent with individual freedom and autonomy and the idea that we can make choices of our own, but it is still open to the possibility of abuse by the platforms themselves. The amendments that I have put forward address, first, the question of what should be the default position. My argument is that the default position should be that filtering is not on and that one has to opt into it, because that seems to me the adult proposition, the adult choice.
The danger is that the platforms themselves will either opt you into filtering automatically as the default, so you do not see what might be called the full-fat milk that is available on the internet, or that they harass you to do so with constant pop-ups, which we already get. If you go on the Nextdoor website, you constantly get the pop-up saying, “You should switch on notifications”. I do not want notifications; I want to look at it when I want to look at it. I do not want notifications, but I am constantly being driven into pressing the button that says, “Switch on notifications”. You could have something similar here—constantly being driven into switching on the filters—because the platforms themselves will be very worried about the possibility that you might see illegal content. We should guard against that.
Secondly, on Amendment 58, if we are going to have user verification—as I say, there is a lot to be said for that approach—it should be applied consistently. If the platform decides to filter out racist abuse and you opt in to filtering out racist abuse or some other sort of specified abuse, it has to filter all racist abuse, not simply racist abuse that comes from people it does not like; or, with gender reassignment abuse, it cannot filter out stuff from only one side or other of the argument. The word “consistently” that is included here is intended to address that, and to require policies that show that, if you opt in to having something filtered out, it would be done on a proper, consistent and systematic basis and not influenced by the platform’s own particular political views.
Finally, we come to Amendment 63 and the question of how this is communicated to users of the internet. This amendment would force the platforms to make these policies about how user verification will operate a part of their terms and conditions in a public and visible way and to ensure that those provisions are applied consistently. It goes a little further than the other amendments—the others could stand on their own—but would also add a little bit more by requiring public and consistent policies that people can see. This works with the grain of what the Government are trying to do; I do not see that the Government can object to any of this. There is nothing wrecking here. It is trying to make everything more workable, more transparent and more obvious.
I hope, given the few minutes or short period of time that will elapse between my sitting down and the Minister returning to the Dispatch Box, that he will have reflected on the negative remarks that he made in his initial speech and will find it possible to accept these amendments now that he has heard the arguments for them.
(1 year, 4 months ago)
Lords ChamberMy Lords, I too support the Minister’s Amendment 1. I remember vividly, at the end of Second Reading, the commitments that we heard from both Front-Benchers to work together on this Bill to produce something that was collaborative, not contested. I and my friends on these Benches have been very touched by how that has worked out in practice and grateful for the way in which we have collaborated across the whole House. My plea is that we can use this way of working on other Bills in the future. This has been exemplary and I am very grateful that we have reached this point.
My Lords, I am grateful to my noble friend the Minister for the meeting that he arranged with me and the noble Baroness, Lady Fox of Buckley, on Monday of this week.
Although we are on Report, I will start with just one preliminary remark of a general character. The more closely one looks at this Bill, the clearer it is that it is the instrument of greatest censorship that we have introduced since the liberalisation of the 1960s. This is the measure with the greatest capacity for reintroducing censorship. It is also the greatest assault on privacy. These principles will inform a number of amendments that will be brought forward on Report.
Turning now to the new clause—I have no particular objection to there being an introductory clause—it is notable that it has been agreed by the Front Benches and by the noble Baroness, Lady Kidron, but that it has not been discussed with those noble Lords who have spoken consistently and attended regularly in Committee to speak up in the interests of free speech and privacy. I simply note that as a fact. There has been no discussion about it with those who have made those arguments.
Now, it is true that the new clause does refer to both free speech and privacy, but it sounds to me very much as though these are written almost as add-ons and afterthoughts. We will be testing, as Report stage continues, through a number of amendments, whether that is in fact the case or whether that commitment to free speech and privacy is actually being articulated and vindicated in the Bill.
My Lords, needless to say, I disagree with what the noble Lord, Lord Moylan, has just been saying precisely because I believe that the new clause that the Minister has put forward, which I have signed and has support across the House, expresses the purpose of the Bill in the way that the original Joint Committee wanted. I pay tribute to the Minister, who I know has worked extremely hard, in co-operation with the noble Lord, Lord Stevenson of Balmacara, to whom I also pay tribute for getting to grips with a purpose clause. The noble Baronesses, Lady Kidron and Lady Harding, have put their finger on it: this is more about activity and design than it is about content, and that is the reason I fundamentally disagree with the noble Lord, Lord Moylan. I do not believe that will be the impact of the Bill; I believe that this is about systemic issues to do with social media, which we are tackling.
I say this slightly tongue-in-cheek, but if the Minister had followed the collective wisdom of the Joint Committee originally, perhaps we would not have worked at such breakneck speed to get everything done for Report stage. I believe that the Bill team and the Minister have worked extremely hard in a very few days to get to where we are on many amendments that we will be talking about in the coming days.
I also want to show my support for the noble Baroness, Lady Merron. I do not believe it is just a matter of the Interpretation Act; I believe this is a fundamental issue and I thank her for raising it, because it was not something that was immediately obvious. The fact is that a combination of characteristics is a particular risk in itself; it is not just about having several different characteristics. I hope the Minister reflects on this and can give a positive response. That will set us off on a very good course for the first day of Report.
(1 year, 4 months ago)
Lords ChamberMy Lords, as we enter the final stages of consideration of this Bill, it is a good time to focus a little more on what is likely to happen once it becomes law, and my Amendment 28 is very much in that context. We now have a very good idea of what the full set of obligations that in-scope services will have to comply with will look like, even if the detailed guidance is still to come.
With this amendment I want to return to the really important question that I do not believe we answered satisfactorily when we debated it in Committee. That is that there is a material risk that, without further amendment or clarification, Wikipedia and other similar services may feel that they can no longer operate in the United Kingdom.
Wikipedia has already featured prominently in our debates, but there are other major services that might find themselves in a similar position. As I was discussing the definitions in the Bill with my children yesterday—this may seem an unusual dinner conversation with teenagers, but I find mine to be a very useful sounding board—they flagged that OpenStreetMap, to which we all contribute, also seems to be in the scope of how we have defined user-to-user services. I shall start by asking some specific questions so that the Minister has time to find the answers in his briefing or have them magically delivered to him before summing up: I shall ask the questions and then go on to make the argument.
First, is it the Government’s view that Wikipedia and OpenStreetMap fall within the definition of user-to-user services as defined in Clause 2 and the content definition in Clause 211? We need to put all these pieces together to understand the scope. I have chosen these services because each is used by millions of people in the UK and their functionality is very well known, so I trust that the Government had them in mind when they were drafting the legislation, as well as the more obvious services such as Instagram, Facebook et cetera.
Secondly, can the Minister confirm whether any of the existing exemptions in the Bill would apply to Wikipedia and OpenStreetMap such that they would not have to comply with the obligations of a category 1 or 2B user-to-user service?
Thirdly, does the Minister believe that the Bill as drafted allows Ofcom to use its discretion in any other way to exempt Wikipedia and OpenStreetMap, for example through the categorisation regulations in Schedule 11? As a spoiler alert, I expect the answers to be “Yes”, “No” and “Maybe”, but it is really important that we have the definitive government response on the record. My amendment would seek to turn that to “Yes”, “Yes” and therefore the third would be unnecessary because we would have created an exemption.
The reason we need to do this is not in any way to detract from the regulation or undermine its intent but to avoid facing the loss of important services at some future date because of situations we could have avoided. This is not hyperbole or a threat on the part of the services; it is a natural consequence if we impose legal requirements on a responsible organisation that wants to comply with the law but knows it cannot meet them. I know it is not an intended outcome of the Bill that we should drive these services out, but it is certainly one intended outcome that we want other services that cannot meet their duties of care to exit the UK market rather than continue to operate here in defiance of the law and the regulator.
We should remind ourselves that at some point, likely to be towards the end of 2024, letters will start to arrive on the virtual doormats of all the services we have defined as being in scope—these 25,000 services—and their senior management will have a choice. I fully expect that the Metas, the Googles and all such providers will say, “Fine, we will comply. Ofcom has told us what we need to do, and we will do it”. There will be another bunch of services that will say, “Ofcom, who are they? I don’t care”, and the letter will go in the bin. We have a whole series of measures in the Bill by which we will start to make life difficult for them: we will disrupt their businesses and seek to prosecute them and we will shut them out of the market.
However, there is a third category, which is the one I am worried about in this amendment, who will say, “We want to comply, we are responsible, but as senior managers of this organisation”, or as directors of a non-profit foundation, “we cannot accept the risk of non-compliance and we do not have the resources to comply. There is no way that we can build an appeals mechanism, user reporting functions and all these things we never thought we would need to have”. If you are Wikipedia or OpenStreetMap, you do not need to have that infrastructure, yet as I read the Bill, if they are in scope and there is no exemption, they are going to be required to build all that additional infrastructure.
The Bill already recognises that there are certain classes of services where it would be inappropriate to apply this new regulatory regime, and it describes these in Schedule 1, which I am seeking to amend. My amendment just seeks to add a further class of exempted service and it does this quite carefully so that we would exclude only services that I believe most of us in this House would agree should not be in scope. There are three tests that would be applied.
The first is a limited functionality test—we already have something similar in Schedule 1—so that the user-to-user functions are only those that relate to the production of what I would call a public information resource. In other words, users engage with one another to debate a Wikipedia entry or a particular entry on a map on OpenStreetMap. So, there is limited user-to-user functionality all about this public interest resource. They are not user-to-user services in the classic sense of social media; they are a particular kind of collective endeavour. These are much closer to newspaper publishers, which we have explicitly excluded from the Bill. It is much more like a newspaper; it just happens to be created by users collectively, out of good will, rather than by paid professional journalists. They are very close to that definition, but if you read Schedule 1, I do not think the definition of “provider content” in paragraph 4(2) includes at the moment these collective-user endeavours, so they do not currently have the exemption.
I have also proposed that Ofcom would carry out a harm test to avoid the situation where someone argues that their services are a public information resource, while in practice using it to distribute harmful material. That would be a rare case, but noble Lords can conceive of it happening. Ofcom would have the ability to say that it recognises that Wikipedia does not carry harmful content in any meaningful way, but it would also have the right not to grant the exemption to service B that says it is a new Wikipedia but carries harmful content.
Thirdly, I have suggested that this is limited to non-commercial services. There is an argument for saying any public information resource should benefit, and that may be more in line with the amendment proposed by the noble Lord, Lord Moylan, where it is defined in terms of being encyclopaedic or the nature of the service. I recognise that I have put in “non-commercial” as belt and braces because there is a rationale for saying that, while we do not really want an encyclopaedic resource to be in the 2B service if it has got user-to-user functions, if it is commercial, we could reasonably expect it to find some way to comply. It is different when it is entirely non-commercial and volunteer-led, not least because the Wikimedia Foundation, for example, would struggle to justify spending the money that it has collected from donors on compliance costs with the UK regime, whereas a commercial company could increase its resources from commercial customers to do that.
I hope this is a helpful start to a debate in which we will also consider Amendment 29, which has similar goals. I will close by asking the Minister some additional questions. I have asked him some very specific ones to which I hope he can provide answers, but first I ask: does he acknowledge the genuine risk that services like Wikipedia and OpenStreetMap could find themselves in a position where they have obligations under the Bill that they simply cannot comply with? It is not that they are unwilling, but there is no way for them to do all this structurally.
Secondly, I hope the Minister would agree that it is not in the public interest for Ofcom to spend significant time and effort on the oversight of services like these; rather, it should spend its time and effort on services, such as social media services, that we believe to be creating harms and are the central focus of the Bill.
Thirdly, will the Minister accept that there is something very uncomfortable about a government regulator interfering with the running of a neutral public resource like Wikipedia, when there is so much benefit from it and little or no demonstrable harm? It is much closer to the model that exists for a newspaper. We have debated endlessly in this House—and I am sure we will come back to it—that there is, rightly, considerable reluctance to have regulators going too far and creating this relationship with neutral public information goods. Wikipedia falls into that category, as do OpenStreetMap and others, and there would be fundamental, in-principle challenges around that.
I hope the Government will agree that we should be taking steps to make sure we are not inadvertently creating a situation where, in one or two years’ time, Ofcom will come back to us saying that it wrote to Wikipedia, because the law told it to do so, and told Wikipedia all the things that it had to do; Wikipedia took it to its senior management and then came back saying that it is shutting shop in the UK. Because it is sensible, Ofcom would come back and say that it did not want that and ask to change the law to give it the power to grant an exemption. If such things deserve an exemption, let us make it clear they should have it now, rather than lead ourselves down this path where we end up effectively creating churn and uncertainty around what is an extraordinarily valuable public resource. I beg to move.
My Lords, Amendments 29 and 30 stand in my name. I fully appreciated, as I prepared my thoughts ahead of this short speech, that a large part of what I was going to say might be rendered redundant by the noble Lord, Lord Allan of Hallam. I have not had a discussion with him about this group at all, but it is clear that his amendment is rather different from mine. Although it addresses the same problem, we are coming at it slightly differently. I actually support his amendment, and if the Government were to adopt it I think the situation would be greatly improved. I do prefer my own, and I think he put his finger on why to some extent: mine is a little broader. His relates specifically to public information, whereas mine relates more to what can be described as the public good. So mine can be broader than information services, and I have not limited it to non-commercial operations, although I fully appreciate that quite a lot of the services we are discussing are, in practice, non-commercial. As I say, if his amendment were to pass, I would be relatively satisfied, but I have a moderate preference for my own.
My Lords, I shall speak briefly to Amendment 174 in my name and then more broadly to this group—I note that the Minister got his defence in early.
On the question of misinformation and disinformation, I recognise what he said and I suppose that, in my delight at hearing the words “misinformation and disinformation”, I misunderstood to some degree what he was offering at the Dispatch Box, but I make the point that this poses an enormous risk to children. As an example, children are the fastest-growing group of far-right believers/activists online, and there are many areas in which we are going to see an exponential growth in misinformation and disinformation as large language models become the norm. So I ask him, in a tentative manner, to look at that.
On the other issue, I have to push back at the Minister’s explanation. Content classification around sexual content is a well-established norm. The BBFC does it and has done it for a very long time. There is an absolute understanding that what is suitable for a U, a PG, a 12 or a 12A are different things, and that as children’s capacities evolve, as they get older, there are things that are more suitable for older children, including, indeed, stronger portrayals of sexual behaviour as the age category rises. So I cannot accept that this opens a new can of worms: this is something that we have been doing for many, many years.
I think it is a bit wrongheaded to imagine that if we “solve” the porn problem, we have solved the problem—because there is still sexualisation and the commercialisation of sex. Now, if you say something about feet to a child, they start to giggle uproariously because, in internet language, you get paid for taking pictures of feet and giving them to strange people. There are such detailed and different areas that companies should be looking at. This amendment in my name and the names of the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford, should be taken very seriously. It is not new ground, so I would ask the Minister to reconsider it.
More broadly, the Minister will have noticed that I liberally added my name to the amendments he has brought forward to meet some of the issues we raised in Committee, and I have not added my name to the schedule of harms. I want to be nuanced about this and say I am grateful to the Government for putting them in the Bill, I am grateful that the content harms have been discussed in this Chamber and not left for secondary legislation, and I am grateful for all the conversations around this. However, harm cannot be defined only as content, and the last grouping got to the core of the issue in the House. Even when the Minister was setting out this amendment, he acknowledged that the increase in harm to users may be systemic and by design. In his explanation, he used the word “harm”; in the Bill, it always manifests as “harmful content”.
While the systemic risk of increasing the presence of harmful content is consistently within the Bill, which is excellent, the concept that the design of service may in and of itself be harmful is absent. In failing to do that, the Government, and therefore the Bill, have missed the bull’s-eye. The bull’s-eye is what is particular about this method of communication that creates harm—and what is particular are the features, functionalities and design. I draw noble Lords back to the debate about Wikipedia. It is not that we all love Wikipedia adoringly; it is that it does not pursue a system of design for commercial purposes that entraps people within its grasp. Those are the harms we are trying to get at. I am grateful for the conversations I have had, and I look forward to some more. I have laid down some other amendments for Monday and beyond that would, I hope, deal with this—but until that time, I am afraid this is an incomplete picture.
My Lords, I have a comment about Amendment 174 in the name of the noble Baroness, Lady Kidron. I have no objection to the insertion of subsection (9B), but I am concerned about (9A), which deals with misinformation and disinformation. It is far too broad and political, and if we start at this late stage to try to run off into these essentially political categories, we are going to capsize the Bill altogether. So I took some heart from the fact that my noble friend on the Front Bench appeared disinclined to accept at least that limb of the amendment.
I did want to ask briefly some more detailed questions about Amendment 172 and new subsection (2) in particular. This arises from the danger of having clauses added at late stages of the Bill that have not had the benefit of proper discussion and scrutiny in Committee. I think we are all going to recognise the characteristics that are listed in new subsection (2) as mapping on to the Equality Act, which appears to be their source. I note in passing that it refers in that regard to gender reassignment. I would also note that most of the platforms, in their terms and conditions, refer not to gender reassignment but to various other things such as gender identity, which are really very different, or at least different in detail. I would be interested to ask my noble friend how effectively he expects it to be enforced that the words used in English statute are actually applied by what are, essentially, foreign platforms when they are operating for an audience in the United Kingdom—I am going to come back to this in a further amendment later.
I take this opportunity to ask my noble friend the Minister a question; I want some clarity about this. Would an abusive comment about a particular religion—let us say a religion that practised cannibalism or a historical religion that sacrificed babies, as we know was the norm in Carthage—count as “priority harmful content”? I appreciate that we are mapping the language of the Equality Act, but are we creating a new offence of blasphemy in this Bill?
As was pointed out by others in the debate, the key provision in Amendment 172 is subsection (2) of the proposed new clause, which relates to:
“Content which is abusive and which targets any of the following characteristics”.
It must both be abusive and target the listed characteristics. It does not preclude legitimate debate about those things, but if it were abusive on the basis of those characteristics—rather akin to the debate we had in the previous group and the points raised by the noble Baroness, Lady Kennedy of The Shaws, about people making oblique threats, rather than targeting a particular person, by saying, “People of your characteristic should be abused in the following way”—it would be captured.
My noble friend seemed to confirm what I said. If I wish to be abusive—in fact, I do wish to be abusive—about the Carthaginian religious practice of sacrificing babies to Moloch, and I were to do that in a way that came to the attention of children, would I be caught as having created “priority harmful content”? My noble friend appears to be saying yes.
Does my noble friend wish to do that and direct it at children?
With respect, it does not say “directed at children”. Of course, I am safe in expressing that abuse in this forum, but if I were to do it, it came to the attention of children and it were abusive—because I do wish to be abusive about that practice—would I have created “priority harmful content”, about which action would have to be taken?
I will leap to the Minister’s defence on this occasion. I remind noble colleagues that this is not about individual pieces of content; there would have to be a consistent flow of such information being proffered to children before Ofcom would ask for a change.
My Lords, these words have obviously appeared in the Bill in one of those unverified sections; I have clicked the wrong button, so I cannot see them. Where does it say in Amendment 172 that it has to be a consistent flow?
May I attempt to assist the Minister? This is the “amber” point described by the noble Lord, Lord Allan: “priority content” is not the same as “primary priority content”. Priority content is our amber light. Even the most erudite and scholarly description of baby eating is not appropriate for five year-olds. We do not let it go into “Bod” or any of the other of the programmes we all grew up on. This is about an amber warning: that user-to-user services must have processes that enable them to assess the risk of priority content and primary priority content. It is not black and white, as my noble friend is suggesting; it is genuinely amber.
(1 year, 4 months ago)
Lords Chamber
My Lords, as the noble Lord, Lord Clement-Jones, said, this is a very broad group, so I hope noble Lords will forgive me if I do not comment on every amendment in it. However, I have a great deal of sympathy for the case put forward by my noble friend Lady Buscombe and my noble and learned friend Lord Garnier. The addition of the word “financial” to Clause 160 is not only merited on the case made but is a practical and feasible thing to do in a way that the current inclusion of the phrase “non-trivial psychological” is not. After all, a financial loss can be measured and we know how it stands. I will also say that I have a great deal of sympathy with what the noble Lord, Lord Clement-Jones, said about his amendment. In so far as I understand them—I appreciate that they have not yet been spoken to—I am also sympathetic to the amendments in the names of the noble Baroness, Lady Kennedy of The Shaws, and the noble Lord, Lord Allan of Hallam.
I turn to my Amendment 265, which removes the word “psychological” from this clause. We have debated this already, in relation to other amendments, so I am going to be fairly brief about it. Probably through an oversight of mine, this amendment has wandered into the wrong group. I am going to say simply that it is still a very, very good idea and I hope that my noble friend, when he comes to reflect on your Lordships’ Committee as a whole, will take that into account and respond appropriately. Instead, I am going to focus my remarks on the two notices I have given about whether Clauses 160 and 161 should stand part of the Bill; Clause 161 is merely consequential on Clause 160, so the meat is whether Clause 160 should stand part of the Bill.
I was a curious child, and when I was learning the Ten Commandments—I am sorry to see the right reverend Prelate has left because I hoped to impress him with this—I was very curious as to why they were all sins, but some of them were crimes and others were not. I could not quite work out why this was; murder is a crime but lying is not a crime—and I am not sure that at that stage I understood what adultery was. In fact, lying can be a crime, of course, if you undertake deception with intent to defraud, and if you impersonate a policeman, you are lying and committing a crime, as I understand it—there are better-qualified noble Lords than me to comment on that. However, lying in general has never been a crime, until we get to this Bill, because for the first time this Bill makes lying in general—that is, the making of statements you know to be false—a crime. Admittedly, it is a crime dependent on the mode of transmission: it has to be online. It will not be a crime if I simply tell a lie to my noble and learned friend Lord Garnier, for example, but if I do it online, any form of statement which is not true, and I know not to be true, becomes a criminal act. This is really unprecedented and has a potentially chilling effect on free speech. It certainly seems to be right that, in your Lordships’ Committee, the Government should be called to explain what they think they are doing, because this is a very portentous matter.
The Bill states that a person commits the false communications offence if they send a message that they know to be false, if they intend the message to cause a degree of harm of a non-trivial psychological or physical character, and if they have no reasonable excuse for sending the message. Free speech requires that one should be allowed to make false statements, so this needs to be justified. The wording of the offence raises substantial practical issues. How is a court meant to judge what a person knows to be false? How is a committee of the House of Commons meant to judge, uncontroversially, what a person knows to be false at the time they say it? I say again: what is non-trivial psychological harm and what constitutes an excuse? None of these things is actually defined; please do not tell me they are going to be defined by Ofcom—I would not like to hear that. This can lead to astonishing inconsistency in the courts and the misapplication of criminal penalties against people who are expressing views as they might well be entitled to do.
Then there is the question of the audience, because the likely audience is not just the person to whom the false statement is directed but could be anybody who subsequently encounters the message. How on earth is one going to have any control over how that message travels through the byways and highways of the online world and be able to say that one had some sense of who it was going to reach and what non-trivial psychological harm it might cause when it reached them?
We are talking about this as if this criminal matter is going to be dealt with by the courts. What makes this whole clause even more disturbing is that in the vast majority of cases, these offences will never reach the courts, because there is going to be, inevitably, an interaction with the illegal content duties in the Bill. By definition, these statements will be illegal content, and the platforms have obligations under the Bill to remove and take down illegal content when they become aware of it. So the platform is going to have to make some sort of decision not only about the truth of the statement but about whether the person knows that the statement is false and what their intention is. Under the existing definition of illegal content, they will be required to remove anything they reasonably believe is likely to be false and to prevent it spreading further, because the consequences of it, in terms of the harm it might do, are incalculable by them at that point.
We are placing a huge power of censorship—and mandating it—on to the platforms, which is one of the things that some of us in this Committee have been very keen to resist. Just exploring those few points, I think my noble friend really has to explain what he thinks this clause is doing, how it is operable and what its consequences are going to be for free speech and censorship. As it stands, it seems to me unworkable and dangerous.
Does my noble friend agree with me that our courts are constantly looking into the state of mind of individuals to see whether they are lying? They look at what they have said, what they have done and what they know. They can draw an inference based on the evidence in front of them about whether the person is dishonest. This is the daily bread and butter of court. I appreciate the points he is making but, if I may say so, he needs to dial back slightly his apoplexy. Underlying this is a case to be made in justice to protect the innocent.
I did not say that it would be impossible for a court to do this; I said it was likely to lead to high levels of inconsistency. We are dealing with what is likely to be very specialist cases. You can imagine this in the context of people feeling non-trivially psychologically harmed by statements about gender, climate, veganism, and so forth. These are the things where you see this happening. The idea that there is going to be consistency across the courts in dealing with these issues is, I think, very unlikely. It will indeed have a chilling effect on people being able to express views that may be controversial but are still valid in an open society.
My Lords, I want to reflect on the comments that the noble Lord, Lord Moylan, has just put to us. I also have two amendments in the group; they are amendments to the government amendment, and I am looking to the Minister to indicate whether it is helpful for me to explain the rationale of my amendments now or to wait until he has introduced his. I will do them collectively.
First, the point the noble Lord, Lord Moylan, raised is really important. We have reached the end of our consideration of the Bill; we have spent a lot of time on a lot of different issues, but we have not spent very much time on these new criminal offences, and there may be other Members of your Lordships’ House who were also present when we discussed the Communications Act back in 2003, when I was a Member at the other end. At that point, we approved something called Section 127, which we were told was essentially a rollover of the dirty phone call legislation we had had previously, which had been in telecoms legislation for ever to prevent that deep-breathing phone call thing.
It needs to be addressed, because these very small websites already alluded to are providing some extremely nasty stuff. They are not providing support to people and helping decrease the amount of harm to those self-harming but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.
Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.
If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.
(1 year, 5 months ago)
Lords Chamber
My Lords, I shall speak to Amendments 59, 107 and 264 in this group, all of which are in my name. Like the noble Baroness, Lady Merron, I express gratitude to Full Fact for its advice and support in preparing them.
My noble friend Lord Bethell has just reminded us of the very large degree of discretion that is given to platforms by the legislation in how they respond to information that we might all agree, or might not agree, is harmful, misinformation or disinformation. We all agree that those categories exist. We might disagree about what falls into them, but we all agree that the categories exist, and the discretion given to the providers in how to handle it is large. My amendments do not deal specifically with health-related misinformation or disinformation but are broader.
The first two, Amendments 59 and 107—I am grateful to my noble friend Lord Strathcarron for his support of Amendment 59—try to probe what the Government think platforms should do when harmful material, misinformation and disinformation appear on their platforms. As things stand, the Government require that the platforms should decide what content is not allowed on their platforms; then they should display this in their terms of service; and they should apply a consistent approach in how they manage content that is in breach of their terms of service. The only requirement is for consistency. I have no objection to their being required to behave consistently, but that is the principal requirement.
What Amendments 59 and 107 do—they have similar effects in different parts of the Bill; one directly on the platforms; the other in relation to codes of practice—is require them also to act proportionately. Here, it might be worth articulating briefly the fact that there are two views about platforms and how they respond, both legitimate. One is that some noble Lords may fear that platforms will not respond at all: in other words, they will leave harmful material on their site and will not properly respond.
The other fear, which is what I want to emphasise, is that platforms will be overzealous in removing material, because they will have written their terms of service, as I said on a previous day in Committee, not only for their commercial advantage but also for their legal advantage. They will have wanted to give themselves a wide latitude to remove material, or to close accounts, because that will help cover their backs legally. Of course, once they have granted themselves those powers, the fear is that they will use them overzealously, even in cases where that would be an overreaction. These two amendments seek to oblige the platforms to respond proportionately, to consider alternative approaches to cancellation and removal of accounts and to be obliged to look at those as well.
There are alternative approaches that they could consider. Some companies already set out to promote good information, if you like, and indeed we saw that in the Covid-19 pandemic. My noble friend Lord Bethell said that they did so, and they did so voluntarily. This amendment would not explicitly but implicitly encourage that sort of behaviour as a first resort, rather than cancellation, blocking and removal of material as a first resort. They would still have the powers to cancel, block and remove; it is a question of priority and proportionality.
There are also labels that providers can put on material that they think is dubious, saying, “Be careful before you read this”, or before you retweet it; “This is dubious material”. Those practices should also be encouraged. These amendments are intended to do that, but they are intended, first and foremost, to probe what the Government’s attitude is to this, whether they believe they have any role in giving guidance on this point and how they are going to do so, whether through legislation or in some other way, because many of us would like to know.
Amendment 264, supported by my noble friend Lord Strathcarron and the noble Lord, Lord Clement-Jones, deals with quite a different matter, although it falls under the general category of misinformation and disinformation: the role the Government take directly in seeking to correct misinformation and disinformation on the internet. We know that No. 10 has a unit with this explicit purpose and that during the Covid pandemic it deployed military resources to assist it in doing so. Nothing in this amendment would prevent that continuing; nothing in it is intended to create scare stories in people’s minds about an overweening Government manipulating us. It is intended to bring transparency to that process.
The Minister mentioned “acute” examples of misinformation and used the example of the pandemic. I tried to illustrate that perhaps, with hindsight, what were seen as acute examples of misinformation turned out to be rather more accurate than we were led to believe at the time. So my concern is that there is already an atmosphere of scepticism about official opinion, which is not the same as misinformation, as it is sometimes presented. I used the American example of the Hunter Biden laptop so we could take a step away.
This might be an appropriate moment for me to say—on the back of that—that, although my noble friend explained current government practice, he has not addressed my point on why there should not be an annual report to Parliament that describes what government has done on these various fronts. If the Government regularly meet newspaper publishers to discuss the quality of information in their newspapers, I for one would have entire confidence that the Government were doing so in the public interest, but I would still quite like—I think the Government would agree on this—a report on what was happening, making an exception for national security. That would still be a good thing to do. Will my noble friend explain why we cannot be told?
While I am happy to elaborate on the work of the counter-disinformation unit in the way I just have, the Government cannot share operational details about its work, as that would give malign actors insight into the scope and scale of our capabilities, which would not be in the public interest. Moreover, reporting representations made to platforms by the unit would also be unnecessary, as this would overlook both the existing processes that govern engagements with external parties and the new protections that are introduced through the Bill.
In the first intervention, the noble Baroness, Lady Fox, gave a number of examples, some of which are debatable, contestable facts. Companies may well choose to keep them on their platforms within their terms of service. We have also seen deliberate misinformation and disinformation during the pandemic, including from foreign actors promoting more harmful disinformation. It is right that we take action against this.
I hope that I have given noble Lords some reassurance on the points raised about the amendments in this group. I invite them not to press the amendments.