My Lords, in moving my Amendment 13 I will speak to all the amendments in the group, all of which are in my name with the exception of Amendment 157 in the name of my noble friend Lord Pickles. These are interlinked amendments; they work together, and in effect they amount to a single proposition. A noble Lord challenged me a day or two ago as to whether I could summarise in a sentence what the amendment does, and the answer is that I think I can: Clause 23 imposes various duties on search engines, and this amendment would remove one of those duties from search engines that fall into category 2B.
There are two categories of search engines, 2A and 2B, and category 2B is the smaller search engines. We do not know the difference between them in greater detail than that because the schedule that relates to them reserves to the Secretary of State the power to set the thresholds that will define which category a search engine falls into, but I think it is clear that category 2B is the smaller ones.
These amendments pursue a theme that I brought up in Committee earlier in the week when I argued that the Bill would put excessively onerous and unnecessary obligations on smaller businesses. The particular duty that these amendments would take away from smaller search engines is referred to in Clause 23(2):
“A duty, in relation to a service, to take or use proportionate measures relating to the design or operation of the service to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service”.
The purpose of that is to recognise that very large numbers of smaller businesses do not pose a risk, according to the Government’s own assessment of the market, and to allow them to get on with their business without taking these onerous and difficult measures. They are probing amendments to try to find out what the Government are willing to do in relation to smaller businesses that will make this a workable Bill.
I can already imagine that there are noble Lords in the Chamber who will say that small does not equal safe, and that small businesses need to be covered by the same rigorous regulations as larger businesses. But I am not saying that small equals safe. I am saying—as I attempted to say when the Committee met earlier—that absolute safety is not attainable. It is not attainable in the real world, nor can we expect it to be attainable in the online world. I imagine that objection will be made. I see that it has some force, but I do not think it has sufficiently compelling force to justify the sort of burden on small businesses that this Bill would impose, and I would like to hear more about it.
I will say one other thing. Those who object to this approach need to be sure in their own minds that they are not contributing to creating a piece of legislation that, when it comes into operation, is so difficult to implement that it becomes discredited. There needs to be a recognition that this has to work in practice. If it does not—if it creates resentment and opposition—we will find the Government not bringing sections of it into force, needing to repeal them or going easy on them once the blowback starts, so to speak. With that, I beg to move.
My Lords, I will speak to Amendment 157 in the name of the noble Lord, Lord Pickles, and others, since the noble Lord is unavoidably absent. It is along the same lines as Amendment 13; it is relatively minor and straightforward, and asks the Government to recognise that search services such as Google are enormously important as an entry to the internet. They are different from social media companies such as Twitter. We ask that the Government be consistent in applying their stated terms when these are breached in respect of harm to users, whether that be through algorithms, through auto-prompts or otherwise.
As noble Lords will be aware, the Bill treats user-to-user services, such as Meta, and search services, such as Google, differently. The so-called third shield or toggle proposed for shielding users from legal but harmful content, should they wish to be shielded, does not apply when it comes to search services, important though they are. Indeed, at present, large, traditional search services, including Google and Microsoft Bing, and voice search assistants, including Alexa and Siri, will be exempted from several of the requirements for large user-to-user services—category 1 companies. Why the discrepancy? Though search services rightly highlight that the content returned by a search is not created or published by them, the algorithmic indexing, promotion and search prompts provided in search bars—the systems they design and employ—are their responsibility, and these have been proven to do harm.
Some of the examples of such harm have already been cited in the other place, but not before this Committee. I do not want to give them too much of an airing because they are in the past, and the search companies have taken them down after complaints, but some of the dreadful things that emerge from searching on Google et cetera are a warning of what could occur. It has been pointed out that search engines would in the past have thrown up, for example, swastikas, SS bolts and other Nazi memorabilia when people searched for desk ornaments. If George Soros’s name came up, he would be included in a list of people responsible for world evils. The Bing service, which I dislike anyway, has been directing people—at least, it did in the past—to anti-Semitic and homophobic searches through its auto-complete, while Google’s image carousel highlighted pictures of portable barbecues to those searching for the term “Jewish baby stroller”.
My Lords, I also support Amendment 157, which stands in the name of the noble Lord, Lord Pickles, and others, including my own. As the noble Baroness, Lady Deech, indicated, it is specific in the nature of what it concentrates on. The greatest concern that arises through the amendment is with reference to category 2A. It is not necessarily incompatible with what the noble Lord, Lord Moylan, proposes; I do not intend to make any direct further comment on his amendments. While the amendment is specific, it has a resonance with some of the other issues raised on the Bill.
I am sure that everyone within this Committee would want to have a Bill that is as fit for purpose as possible. The Bill was given widespread support at Second Reading, so there is a determination across the Chamber to have that. Where we can make improvements to the Bill, we should do that and, as much as possible, try to future-proof the Bill. The wider resonance is the concern that, if the Bill is to be successful, we need as much consistency and clarity within it as possible, particularly for users. Where we have a false dichotomy in the regulations, that runs contrary to the intended purposes of the Bill and creates inadvertent opportunities for loopholes. As such, and as has been indicated, the concern is that in the Bill at present major search engines are effectively treated, in some of the regulations, on a different basis from user-to-user services. For example, some of the provisions around risk assessment, the third shield and the empowerment tools are different.
As also indicated, we are not talking about some of the minor search engines. We are talking about some of the largest companies in the world, be it Google, Microsoft through Bing, Amazon through its devices or Apple through its Siri voice tool, so it is reasonable that they are brought into line with what is there for user-to-user services. The amendment is therefore appropriate, and the rationale for it is that there is a real-world danger. Mention has been made—we do not want to dwell too long on some of the examples, but I will use just one—of the realms of anti-Semitism, where I have a particular interest. For example, a while ago one search engine offered a prompt suggesting that Jews are evil. It was found that while that prompt was there, searches of that nature increased by 10%, and when it was removed, they fell back. This is quite fixable, and it applies across a wide range of areas.
One of the ways in which technology has changed, I think for us all, is the danger that it can be abused by people who seek to radicalise others and make them extreme, particularly young children. Gone are the days when these extremists or terrorists were lonely individuals in an attic, with no real contact with the outside world, or hanging around occasionally in the high street handing out poorly produced A4 papers setting out their hateful ideology. There is a global interconnection here and, in particular, search engines and user-to-user services can be used to try to draw young people into their nefarious activities.
I mentioned the example of extremism and radicalisation when it comes to anti-Semitism. I have seen it from my own part of the world, where there is at times an attempt by those who still see violence as the way forward in Northern Ireland to draw new generations of young people into extremist ideology and terrorist acts. There is an attempt to lure in young people and, sadly, search engines have a role within that, which is why we need to see that level of protection. Now, the argument from search engines is that they should have some level of exemption: how can they be held responsible for everything that appears through their searches, or indeed through the web? But in terms of content, the same argument could be made for user-to-user services. It is right, as the proposer of this amendment has indicated, that there are things such as algorithmic indexing and search prompts over which they do have a level of control.
The use of algorithms has moved on considerably since my schooldays, as it surely has for everyone in this Committee, and I suspect that none of us imagined that they would be used in such a fashion. We need a level of protection through an amendment such as this and, as its proposers, we are not doctrinaire about the precise form in which this should take place. We look, for example, at the provisions within Clause 11—we seek to hear what the Government have to say on that—which could potentially be used to regulate search engines. Ensuring that that power is given, and will be used by Ofcom, would go a long way towards addressing many of the concerns.
I think all of us in this Committee are keen to work together to find the right solutions, but we feel that there is a need to make some level of change to the regulations that are required for search engines. None of us in this Committee believes that we will ultimately have a piece of legislation that reflects perfection, but there is a solemn duty on us all to produce legislation that is as fit for purpose and future-proofed as possible, while providing children in particular with the maximum protection in what is at times an ever-changing and sometimes very frightening world.
My Lords, I agree in part with the noble Lord, Lord Moylan. I was the person who said that small was not safe, and I still feel that. I certainly do not think that anything in the Bill will make the online world 100% safe, and very few noble Lords do, so it is important to say that. When we talk about creating a high bar or having zero tolerance, we are talking about ensuring that there is a ladder within the Bill so that the most extreme cases have the greatest force of law trying to attack them. I agree with the noble Lord on that.
I also absolutely agree with the noble Lord about implementation: if it is too complex and difficult, it will be unused and exploited in certain ways, and it will have a bad reputation. The only part of his amendment that I do not agree with is that we should look at size. Through the process of Committee, if we can look at risk rather than size, we will get somewhere. I share his impatience—or his inquiry—about what categories 2A and 2B mean. If category 2A means the most risky and category 2B means those that are less risky, I am with him all the way. We need to look into the definition of what they mean.
Finally, I mentioned several times on Tuesday that we need to look carefully at Ofcom’s risk profiles. Is this the answer to dealing with where risk gets determined, rather than size?
My Lords, I rise to speak along similar lines to the noble Baroness, Lady Kidron. I will address my noble friend Lord Moylan’s comments. I share his concern that we must not make the perfect the enemy of the good but, like the noble Baroness, I do not think that size is the key issue here, because of how tech businesses grow. Tech businesses are rather like building a skyscraper: if you get the foundations wrong, it is almost impossible to change how safe the building is as it goes up and up. As I said earlier this week, small tech businesses can become big very quickly, and, if you design your small tech business with the risks to children in mind at the very beginning, there is a much greater chance that your skyscraper will not wobble as it gets taller. On the other hand, if your small business begins by not taking children into account at all, it is almost impossible to address the problem once it is huge. I fear that this is the problem we face with today’s social media companies.
The noble Baroness, Lady Kidron, hit the nail on the head, as she so often does, in saying that we need to think about risk, rather than size, as the means of differentiating the proportionate response. In Clause 23, which my noble friend seeks to amend, the important phrase is “use proportionate measures” in subsection (2). Provided that we start with a risk assessment and companies are then under the obligation to make proportionate adjustments, that is how you build safe technology companies—it is just like how you build safe buildings.
My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment it would have been considered a small business but, through content combining pornography with the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.
My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.
As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.
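For noble Lords who, like me, prefer to see things set out mechanically, the scheme as I read it can be sketched in a few lines of illustrative code; every name in the sketch is my own invention, not the Bill’s.

```python
# A sketch of the Clause 86 scheme as I read it; labels are mine, not the Bill's.
# (My "boring, logical" alternative would instead have been 1A/1B for
# large/small user-to-user and 2A/2B for large/small search.)
def categorise(service_type: str, is_large: bool) -> str:
    if service_type == "user-to-user" and is_large:
        return "1"     # category 1: large user-to-user services
    if service_type in ("search", "combined"):
        return "2A"    # category 2A: search or combined services
    return "2B"        # category 2B: smaller user-to-user services
```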
My Lords, I welcome this debate, which, as many noble Lords said, revisits some of the areas discussed in earlier debates about the scope of the Bill. It allows your Lordships’ House to consider what has to be the primary driver for assessment. In my view, and as others said, it ought to be risk, which has to be the absolute driver in all this. As the noble Baroness, Lady Harding, said, businesses do not remain static: they start at a certain size and then change. Of course, we hope that many of the businesses we are talking about will grow, so this is about preparation for growth and the reality of doing business.
As we discussed, there certainly are cases where search providers may, by their very nature, be almost immune from presenting users with content that could be considered either harmful or illegal under this legislative framework. The new clause proposed by the noble Lord, Lord Moylan—I am grateful to him for allowing us to explore these matters—and its various consequential amendments, would limit the duty to prevent access to illegal content to core category 2A search providers, rather than all search providers, as is currently the case under Clause 23(3).
The argument that I believe the noble Lord, Lord Moylan, put forward is that the illegal content duty is unduly wide, placing a disproportionate and otherwise unacceptable burden on smaller and/or supposedly safer search providers. He clearly said he was not saying that small was safe—that is now completely understood—but he also said that absolute safety is not achievable. As the noble Baroness, Lady Kidron, said, that is indeed so. If this legislation is too complex and creates the wrong provisions, we will clearly be a long way away from our ambition, which here has to be to have in place the best legislative framework, one that everyone can work with and that provides the maximum opportunity for safety and what we all seek to achieve.
Of course, the flip side of the argument about an unacceptable burden on smaller, or on supposedly safer, search providers may be that they would in fact have very little work to do to comply with the illegal content duty, at least in the short term. But the duty would act as an important safeguard, should the provider’s usual systems prove ineffective with the passage of time. Again, that point was emphasised in this and the previous debate by the noble Baroness, Lady Harding.
We look forward to the Minister’s response to find out which view he and his department subscribe to or, indeed, whether they have another view they can bring to your Lordships’ House. But, on the face of it, the current arrangements do not appear unacceptably onerous.
Amendment 157 in the name of the noble Lord, Lord Pickles, and introduced by the noble Baroness, Lady Deech, takes a different approach to search, inserting requirements about search services’ publicly available statements into Clause 65. In the debate, the noble Baroness and the noble Lord, Lord Weir, raised very important, realistic examples of where search engines can take us, including to material that encourages racism and hatred directed at Jews and other groups. The amendment addresses issues such as the changing of algorithms or the hiding of content, and the need to ensure that the terms of providers’ publicly available statements are applied consistently.
I look forward to hearing the Minister’s response to Amendment 157, as it certainly moves us beyond questions of scope and towards discussion of the conduct of platforms when harm is identified.
My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.
Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.
Amendments 13, 15, 66 to 69 and 73, tabled by my noble friend Lord Moylan, seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content, or children encountering harmful content, in or via search results. This would increase the likelihood of users, including children, accessing illegal content, and of children accessing harmful content, through these services.
The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.
The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.
Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.
The noble Lord, Lord Allan, and others asked about definitions and I congratulate noble Lords on avoiding the obvious
“To be, or not to be”
pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes but that is the rationale.
I am grateful to the Minister for that clarification. I take it, then, that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. I simply seek clarification that that is the thinking which has informed this.
As I said, the largest and riskiest services may include some which have search functions, so the test of largest and riskiest applies. Smaller and less risky search services are captured in category 2A.
Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.
As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.
My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material becomes illegal in its nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point, and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.
I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.
My Lords, I am grateful to all noble Lords who have spoken in this debate. I hope that the noble Baroness, Lady Deech, and the noble Lord, Lord Weir of Ballyholme, will forgive me if I do not comment on the amendment they spoke to in the name of my noble friend Lord Pickles, except to say that of course they made their case very well.
I will briefly comment on the remarks of the noble Baroness, Lady Kidron. I am glad to see a degree of common ground among us in terms of definitions and so forth—a small piece of common ground that we could perhaps expand in the course of the many days we are going to be locked up together in your Lordships’ House.
I am grateful too to the noble Lord, Lord Allan of Hallam. I am less clear on “2B or not 2B”, if that is the correct way of referring to this conundrum, than I was before. The noble Baroness, Lady Kidron, said that size does not matter and that it is all about risk, but my noble friend the Minister cunningly conflated the two and said at various points “the largest” and “the riskiest”. I do not see why the largest are necessarily the riskiest. On the whole, if I go to Marks & Spencer as opposed to going to a corner shop, I might expect rather less risk. I do not see why the two run together.
I address the question of size in my amendment because that is what the Bill focuses on. I gather that the noble Baroness, Lady Kidron, may want to explore at some stage in Committee why that is the case and whether a risk threshold might be better than a size threshold. If she does that, I will be very interested in following and maybe even contributing to that debate. However, at the moment, I do not think that any of us is terribly satisfied with conflating the two—that is the least satisfactory way of explaining and justifying the structure of the Bill.
On the remarks of my noble friend Lady Harding of Winscombe, I do not want in the slightest to sound as if there is any significant disagreement between us—but there is. She suggested that I was opening the way to businesses building business models “not taking children into account at all”. My amendment is much more modest than that. There are two ways of dealing with harm in any aspect of life. One is to wait for it to arrive and then to address it as it arises; the other is constantly to look out for it in advance and to try to prevent it arising. The amendment would leave fully in place the obligation to remove harm—that is, priority illegal content, or other illegal content that the provider knows about, having been alerted to it by another person or having become aware of it in any other way. That duty would remain. The duty that is removed, especially from small businesses—and really this is quite important—is the obligation constantly to be looking out for harm, because it involves a very large, and I suggest possibly ruinous, commitment to constant monitoring of what appears on a search engine. That is potentially prohibitive, and it arises in other contexts in the Bill as well.
There should be occasions when we can say that knowing that harmful stuff will be removed as soon as it appears, or very quickly afterwards, is adequate for our purposes, without requiring firms to go through a constant monitoring or risk-assessment process. The risk assessment would have to be adjudicated by Ofcom, I gather. Even if no risk was found, of course, that would not be the end of the matter, because I am sure that Ofcom would, very sensibly, require that assessment to be renewed annually, or after a certain period, to make sure that things had not changed. So even escaping the burden is quite a large burden for small businesses, and then implementing the burden is so onerous that it could be ruinous, whereas taking stuff down when it appears is much easier to do.
Perhaps I might briefly come in. My noble friend Lord Moylan may have helped explain why we disagree: our definition of harm is very different. I am most concerned that we address the cumulative harms that online services, both user-to-user services and search, are capable of inflicting. That requires us to focus on the design of the service, which we need to do at the beginning, rather than the simpler harm that my noble friend is addressing, which is specific harmful content—not in the sense in which “content” is used in the Bill but “content” as in common parlance; that is, a piece of visual or audio content. My noble friend makes the valid point that that is the simplest way to focus on removing specific pieces of video or text; I am more concerned that we should not exclude small businesses from designing and developing their services such that they do not consider the broader set of harms that are possible and that add up to the cumulative harm that we see our children suffering from today.
So I think our reason for disagreement is that we are focusing on a different harm, rather than that we violently disagree. I agree with my noble friend that I do not want complex bureaucratic processes imposed on small businesses; they need to design their services when they are small, which makes it simpler and easier for them to monitor harm as they grow, rather than waiting until they have grown. That is because the backwards re-engineering of a technology stack is nigh-on impossible.
My noble friend makes a very interesting point, and there is much to ponder in it—too much to ponder for me to respond to it immediately. Since I am confident that the issue is going to arise again during our sitting in Committee, I shall allow myself the time to reflect on it and come back later.
While I understand my noble friend’s concern about children, the clause that I propose to remove is not specific to children; it relates to individuals, so it covers adults as well. I think I understand what my noble friend is trying to achieve—I shall reflect on it—but this Bill and the clauses we are discussing are a very blunt way of going at it and probably need more refinement even than the amendments we have seen tabled so far. But that is for her to consider.
I think this debate has been very valuable. I did not mention it, but I am grateful also for the contribution from the noble Baroness, Lady Merron. I beg leave to withdraw the amendment.
Having listened to the Minister, I think we need clarification on the issue of duplication and what is illegal as opposed to just harmful. If we can clarify that, I shall not move my Amendment 157.
When we come to Amendment 157, that will be noted.
Amendments 13A to 13C
My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are unable to be present today due to prior commitments. I note that the amendment has also been signed by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.
I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.
We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology that is increasingly being deployed on private messaging services called end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that being an outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.
A second set of amendments would add extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make such targeting less likely by ensuring that there is more scrutiny. They include Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207, all of which would bring more scrutiny of, and input into, the issuing of such a notice.
The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally in all the actions it takes. In particular, Amendment 190 proposes a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.
I will now dig into why this is important. Put simply, there is a risk that, under the Bill, a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate these measures: it can sometimes feel as though we are comfortable ratcheting up the Bill’s requirements on the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.
In the Bill, we are constructing, in effect, a de facto licensing mechanism, where Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—will order them to follow all the relevant regulation and guidance and will instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.
As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark: tech people hate to see their services abused, and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.
If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting content on the services that users had uploaded. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.
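The mechanics are worth a moment. These tools do not read content in any human sense; they compare a fingerprint, or hash, of uploaded material against a database of fingerprints of known illegal material. A minimal sketch of the idea follows, with the caveat that I am using an ordinary cryptographic hash and an invented database purely for illustration, whereas real systems such as PhotoDNA use perceptual hashes designed to survive resizing and re-encoding:

```python
import hashlib

# Hypothetical fingerprint list, standing in for the databases of known child
# sexual abuse imagery maintained by bodies such as the IWF; value is invented.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_material(uploaded: bytes) -> bool:
    # SHA-256 only catches byte-identical copies; PhotoDNA-style perceptual
    # hashes are built to match visually similar images as well.
    return hashlib.sha256(uploaded).hexdigest() in KNOWN_BAD_HASHES
```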
My Lords, I support Amendment 190 in the name of the noble Lord, Lord Clement-Jones, and Amendment 285 in the name of the noble Lord, Lord Stevenson. That is not to say that I do not have a great deal of sympathy for the incredibly detailed and expert speech we have just heard, but I want to say just a couple of things.
First, I think we need to have a new conversation about privacy in general. The privacy that is imagined by one community is between the state and the individual, and the privacy that we do not have is between individuals and the commercial companies. We live in a 3D world and the argument remains 2D. We cannot do that today, but I agree with the noble Lord that many in the enforcement community do have one hand on human rights, and many in the tech world do care about human rights. However, I do not believe that the tech sector has fully fessed up to its role and the contribution it could make around privacy. I hope that, as part of the debate on the Bill, and the debate that we will have subsequently on the data Bill No. 2, we come to untangle some of the things that they defend—in my view, unnecessarily and unfairly.
I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.
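The reason is easy to demonstrate with an open-source library such as PyNaCl: the keys live on the users’ devices, and the service relaying a message only ever handles ciphertext it cannot read. What follows is a toy sketch of the principle, not any real messenger’s implementation:

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice, bob = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts to Bob; these bytes are all a relaying server ever sees,
# so there is no message content left to mine for targeted advertising.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at six")

# Only Bob's private key, paired with Alice's public key, recovers the text.
assert Box(bob, alice.public_key).decrypt(ciphertext) == b"meet at six"
```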
I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Apart from a principled position against it, I think to be explicit is helpful.
Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.
I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.
My Lords, my name is attached to Amendment 203 in this group, along with those of the noble Lords, Lord Clement-Jones, Lord Strathcarron and Lord Moylan. I shall speak in general terms about the nature of the group, because it is most usefully addressed through the fundamental issues that arise. I sincerely thank the noble Lord, Lord Allan, for his careful and comprehensive introduction to the group, which gave us a strong foundation. I have crossed out large amounts of what I had written down and will try not to repeat, but rather pick up some points and angles that I think need to be raised.
As was alluded to by the noble Baroness, Lady Kidron, this debate and the range of these amendments shows that the Bill is currently extremely deficient and unclear in this area. It falls to this Committee to get some clarity and cut-through to see where we could end up and change where we are now.
I start by referring to a briefing, which I am sure many noble Lords have received, from a wide range of organisations, including Liberty, Big Brother Watch, the Open Rights Group, Article 19, the Electronic Frontier Foundation, Reset and Fair Vote. It is quite a range of organisations but very much in the human rights space, particularly the digital human rights space. The introduction of the briefing includes a sentence that gets to the heart of why many of us have received so many emails about this element of the Bill:
“None of us want to feel as though someone is looking over our shoulder when we are communicating”.
I want to take advantage of the noble Baroness having raised that point to say that perhaps I was not clear enough in my speech. While I absolutely agree about not everything, everybody, all the time, for my specific concerns around child sexual abuse, the abuse of women and so on, we have to find new ways of creating targeted approaches, so that it does not have to be everything, everybody, all the time.
I am glad I gave the noble Baroness the opportunity for that intervention. I have a reasonable level of technical knowledge—I hand-coded my first website in 1999, so I go back some way—but given the structures we are dealing with, I question the capacity and whether it is possible to create the tools and say they will be used only in a certain way. If you break the door open, anyone can walk through the door—that is the situation we are in.
As the noble Lord, Lord Allan, said, this is a crucial part of the Bill that was not properly examined and worked through in the other place. I will conclude by saying that it is vital we have a full and proper debate in this area. I hope the Minister can reassure us that he and the department will continue to be in dialogue with noble Lords as the Bill goes forward.
My Lords, I rise to speak to Amendment 205 in my name, but like other noble Lords I will speak about the group as a whole. After the contributions so far, not least from the noble Lord, Lord Allan of Hallam, and the noble Baroness, Lady Bennett of Manor Castle, there is not a great deal left for me to add. However, I will say that we have to understand that privacy is contextual. At one extreme, I know the remarks I make in your Lordships’ House are going to be carefully preserved and cherished; for several centuries, if not millennia, people will be able to see what I said today. If I am in my sitting room, having a private conversation, I expect that not to be heard by somebody, although at the same time I am dimly aware that there might be somebody on the other side of the wall who can hear what I am saying. Similarly, I am aware that if I use the telephone, it is possible that somebody is listening to the call. Somebody may have been duly authorised to do so by reference to a tribunal, having taken all the lawful steps necessary in order to listen to that call, because there are reasons that have persuaded a competent authority that the police service, or whatever, listening to my telephone call has a reason to do so, to avoid public harm or meet some other justified objective agreed on through legislation.
Here, we are going into a sphere of encryption where one assumes privacy and feels one is entitled to some privacy. However, it is possible that the regulator could at any moment step in and demand records from the past—records up to that point—without the intervention of a tribunal, as far as I can see, or without any reference to a warrant or having to explain to anybody their basis for doing so. They would be able to step in and do it. This is the point made by the noble Baroness, Lady Bennett of Manor Castle: unlike the telephone conversation, where it does not have to be everyone, everywhere, all the time—they are listening to just me and the person to whom I am talking—the provider has to have the capacity to go back, get all those records and be able to show Ofcom what it is that Ofcom is looking for. To do that requires them to change their encryption model fundamentally. It is not really possible to get away from everyone, everywhere, all the time, because the model has to be changed in order to do it.
That is why this is such an astonishing thing for the Government to insert in this Bill. I can understand why the security services and so forth want this power, and this is a vehicle to achieve something they have been trying to achieve for a long time. But there is very strong public resistance to it, and it is entirely understood, and to do it in this space is completely at odds with the way in which we felt it appropriate to authorise listening in on private conversations in the past—specific conversations, with the authority of a tribunal. To do it this way is a very radical change and one that needs to be considered almost apart from the Bill, not slipped in as a mere clause and administrative adjunct to it.
My Lords, there have been some excellent speeches so far. The noble Lord, Lord Allan of Hallam, brilliantly laid out why these amendments matter, and the noble Lord, Lord Moylan, explained why this has gained popular interest outside of the House. Not everything that goes on in this House is of interest and people do not study all of the speeches made by the noble Lord, Lord Moylan, even though they are always in the public sphere, but this particular group of amendments has elicited a huge amount of discussion.
We should remember that encrypted chat has become an indispensable part of the way that we live in this country and around the world. According to the Open Rights Group it has replaced the old-fashioned wired telephone—a rather quaint phrase. The fact that the citizens of the United Kingdom think that chat services matter so much that they are used by 60% of the total population should make us think about what we are doing regarding these services.
End-to-end encryption—the most secure form of encryption available—means that your messages are stored on your phone; people feel that they are in control because they are not on some server somewhere. Even WhatsApp cannot read your WhatsApp messages; that is the point of encryption. That is why people use it: the messages are secured with a lock which only you and the recipient have the special key to unlock to read them.
Obviously, there are certain problems. Certain Government Ministers wanted to voluntarily share all of their WhatsApp messages with a journalist who would then share them with the rest of us. If your Lordships were in that group you might have thought that was a rude thing to do. People have their WhatsApp messages leaked all the time, and when it happens we all think, “Oh my God, I’m glad I wasn’t in that WhatsApp group”, because you assume a level of privacy, even though as a grown-up you need to remember that somebody might leak them. But the main point is that they are a secure form of conversation that is widely used.
Everyone has a right to private conversations. I was thinking about how, when society closed down during the lockdown period, we needed technology in order to communicate with each other. We understood that we needed to WhatsApp message or Zoom call our friends and family, and the idea that this would involve the state listening in would have appalled us—we considered that our private life.
We want to be able to chat in confidence and be confident that only ourselves and the recipients can see what we are sharing and hear what we are saying. That is true of everyday life, but there are very good specific cases to be made for its importance, ranging through everything from Iranian women resisting the regime and communicating with each other, to all the civil liberties organisations around the world that use WhatsApp. The security of knowing that you can speak without Putin listening in or that President Xi will not be sent your WhatsApp messages is important.
The Government keep assuring us that we do not need to worry, but the Bill gives Ofcom the power to require services to install tools that would involve the surveillance of encrypted communications, for child exploitation and terrorism content, for example. Advocates on my side argue that this is not possible without undermining encryption because, just as you cannot be half pregnant, you cannot be half encrypted once you install tools for scanning for certain content. There is a danger that we say, “We’re only doing it for those things”, when in fact it would be an attack on encryption itself.
Unlike the noble Baroness, Lady Bennett of Manor Castle, I know nothing about the technical aspects of this, as noble Lords can hear from the way I am speaking about it. But I can see from a common-sense point of view what encryption is: you cannot say, “We’re only going to use it a little bit”. That is my point.
I want to tackle the issue of child abuse, because I know that it lurks around here. It is what really motivates the people who say, “It’s OK as long as we can deal with that”. Scanning is put forward as a solution to the problem of messages of that nature being sent over encrypted chat services, and to the question of what we can do about it. Of course I stress that images of child abuse and exploitation are abhorrent—that is a very important background to this conversation—but I want to draw attention to the question of what we are prepared to do about child abuse, because I think it was referred to in an earlier group. I am nervous that we are promising through this Bill a silver bullet: that it will all be solved through some of these measures.
No one in the Committee or anyone standing behind us who speaks up for children thinks that this is going to be a silver bullet. It is unacceptable to suggest that we take that position. Much child abuse takes place offline and is then put online, but the exponential way in which it is consumed, created, and spread is entirely new because of the services we are talking about. Later in Committee I will explain some of the new ways in which it is creating child abuse—new forms, new technologies, new abuse.
I am sorry to interrupt the noble Baroness. I have made my feelings clear that I am not an end-to-end encryption “breaker”. There are amendments covering this; I believe some of them will come up later in the name of the noble Lord, Lord Russell, on safety by design and so on. I also agree with the noble Baroness that we need more resources in this area for the police, teachers, social workers and so on. However, I do not want child sexual abuse to be a football in this conversation.
I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.
Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I raise these as examples of the problems that we have to consider.
Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.
My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.
I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.
As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.
The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.
My Lords, that last exchange was incredibly helpful. I am grateful to the noble Lord, Lord Allan, for what he just said and the way in which he introduced this group. I want to make only a few brief remarks.
I have put my name to two amendments in this group: Amendment 202 in the name of the noble Lord, Lord Stevenson, which seeks to ensure that Ofcom will be subject to the same kind of requirements and controls as exist under the Regulation of Investigatory Powers Act before issuing a technology notice
“to a regulated service which offers private messaging with end-to-end encryption”;
and Amendment 285, also in the name of the noble Lord, Lord Stevenson, and that of the noble Lord, Lord Clement-Jones. This amendment would make sure that no social media platforms or private end-to-end messaging services have an obligation generally to monitor what is going on across their platforms. When I looked at this group and the various amendments in it, those were the two issues that I thought were critical. These two amendments seemed to approach them in the most simple and straightforward manner.
Like other noble Lords, my main concern is that I do not want search and social media platforms to have an obligation to become what we might describe as thought police. I do not want private messaging firms to start collecting and storing the content of our messages so that they have what we say ready to hand over in case they are required to do so. What the noble Lord, Lord Allan, just said is an important point to emphasise. Some of us heard from senior representatives of WhatsApp a few weeks ago. I was quite surprised to learn how much they are doing in this area to co-operate with the authorities; I felt very reassured to learn about that. I in no way want to discourage that, because they are doing a great deal of good work.
Basically, this is such a sensitive matter, as has been said, that it is important for the Government to be clear what their policy intentions are by being clear in the Bill. If they do not intend to require general monitoring, that needs to be made explicit. It is also important that, if Ofcom is to be given new investigatory powers or powers to insist on things through these technology notices, it is clear that its powers do not go beyond those that are already set out in law. As we have heard from noble Lords, there is widespread concern about this matter not just from the social media platforms and search engines themselves but from news organisations, journalists and those lobby groups that often speak out on liberty-type matters. These topics go across a wide range of interest groups, so I very much hope that my noble friend the Minister will be able to respond constructively and open-mindedly on them.
My Lords, I was not intending to intervene on this group because my noble friend Lord Stevenson will address these amendments in their entirety, but listening in to this public conversation about this group of amendments has stimulated a question that I want both to put on the record and to give the Minister time to reflect on.
If we get the issues of privacy and encrypted messaging wrong, it will push more people into using VPN—virtual private network—services. I went into the app store on my phone to search for VPN software. There is nothing wrong with such software—our parliamentary devices have it to do general monitoring and make sure that we do not use services such as TikTok—but it is used to circumvent much of the regulatory regime that we are seeking to put together through this Bill. When I search for VPNs in the app store, the first one that comes up that is not a sponsored, promoted advertisement has an advisory age limit of four years old. Several of them are the same; some are 17-plus but most are four-plus. Clearly, the app promotes itself very much on the basis that it offers privacy and anonymity, which are the key features of a VPN. However, a review of it says, “I wouldn’t recommend people use this because it turns out that this company sends all its users’ data to China so that it can do general monitoring”.
I am not sure how VPNs are being addressed by the Bill, even though they seem really pertinent to the issues of privacy and encryption. I would be interested to hear whether, and if so how, the use and misuse of VPNs is being brought into scope for regulation by Ofcom.
My Lords, I would like to say something very quickly on VPN. I had a discussion with some teenagers recently, who were all prepared for this Bill—I was quite surprised that they knew a lot about it. They said, “Don’t worry, we’ve worked out how to get around it. Have you heard of VPN?” It reminded me of a visit to China, where I asked a group of students how they dealt with censorship and not being able to google. They said, “Don’t worry about it”, and showed me VPN. It is right that we draw attention to that. There is a danger of inadvertently forcing people on to the unregulated dark web and into areas that we might not imagine. That is why we have to be careful and proportionate in our response.
My Lords, I have long been on record as being for radical reform of the House of Lords, but I do not think there are many Chambers in the world that could have had such an interesting debate on such a key subject—certainly not the House of Commons, sadly. Without falling into the old trap of saying what a wonderful lot we all are, it is important that, in such an important Bill, covering so many important areas of civil liberties and national security, there should be an opportunity, before we get to voting, to have this kind of debate and get some of the issues into the public domain.
I am on the same side as the noble Baroness, Lady Fox, on knowledge of the technology. Looking back 20 years, to when I was on the committee that worked on the communications Bill which set up Ofcom, I see that we were genuinely innocents abroad. We deliberately decided not to try regulating the internet, because we did not know what it was going to do. I do not think that we can have that excuse today.
Perhaps an even more frightening background is that, for three and a half years, during the coalition Government, I was the Minister responsible for data protection—a less well-equipped Minister to protect your data I cannot imagine. However, I remember being taken to some place over the river to have a look at our capacities in this area. Having seen some of the things that were being done, I rather timidly asked the expert who was showing me round, “Aren’t there civil liberty issues in what you’re doing?” He said, “Oh no, sir. Tesco know far more about you than we do”.
There is this element about what is secret. The noble Baroness, Lady Fox, in her last contribution, said that children look with contempt at some of the safeguards and blockages that keep them away from things. I do not think anybody is deluding themselves that there is some silver bullet. As always, Parliament must do its best to address real national concerns and real problems in the best way that we see at this time. There is a degree of cross-party and Cross-Bench unity, in that there are real and present dangers in how these technologies are being used, and real and present abuses of a quite horrific kind. The noble Baroness, Lady Kidron, is right. This technology has given a quantum leap to the damage that the abuser and the pornographer can do to our society, in the same way that it has given a quantum leap to those who want to undermine the truth and fairness of our election system. There are real problems that must be addressed.
Although it has not been present in this debate, it is no help to polarise the argument as being between the state wanting to accrue more and more powers and brave defenders of civil liberties. As somebody who has practised some of these dark arts myself, I advise those who are organising letters to ensure that those sending them do not leave in the paragraph that says, “Here you may want to include some personal comments”. It waters down the credibility of this as some independent exercising of a democratic right.
I make a plea, as someone on the edges of the debate who at times had some direct responsibilities—and I hope the Minister hears it: use what the Bill has thrown up to consider whether it is now in the right shape. The Government should not be ashamed to take it away and think a bit. It may be that we can add some of the protections that we quite often do, such as allowing certain interventions only after a judge or senior police officer or others have been involved. That may already be in other parts of the Bill. However, it would be wrong to allow the Bill to polarise this issue, given that everyone who spoke this morning is trying to deal with very real difficulties, problems and challenges, within the framework of a democratic society, in a way that protects our freedoms but also protects us from real and present dangers.
My Lords, this is the first time that I have spoken on the Bill in Committee. I know noble Lords are keen to move on and get through the groups as quickly as possible, but I hope they will forgive me if I say that I will speak only about twice on the Bill, and this is one of the groups that I want to speak to. I will try not to make your Lordships impatient.
I should tell the Committee a little about where I am coming from. I was very geeky as a kid. I learned to program and code. I did engineering at university and coded there. My master’s degree in the late 1980s was about technology and policy, so I have been interested in technology policy since then, having followed it through in my professional life. In 1996, I wrote a book on EU telecoms—it sold so well that no one has ever heard of it. One thing I said in that book, which though not an original thought is pertinent today, is that the regulation will always be behind the technology. We will always play catch-up, and we must be concerned about that.
Interestingly, when you look at studies of technology adoption—pioneers, early adopters and then the rest of the population—quite often you see that the adult industry is at the leading edge, such as with cable TV, satellite TV, video cassettes, online conferencing, et cetera. I assure your Lordships that I have not done too much primary research into this, but it is an issue that we ought to be aware of.
I will not speak often in this debate, because there are many issues on which there is no disagreement. For example, I have already had a conversation with the noble Baroness, Lady Kidron, and we all agree that we need to protect children. We also know that we need to protect vulnerable adults; there is no disagreement on that. However, in these discussions there will be inevitable trade-offs between security and safety on the one hand and freedom on the other. It is right to have these conversations to ensure that we get the balance right, with the wisdom of noble Lords. Sacrifices will be made on either side of the debate, and we should be very careful as we navigate this.
I am worried about some of the consequences for freedom of expression. When I was head of a research think tank, one of the phenomena that I became interested in was that of unintended consequences. Well-meaning laws and measures have often led to unintended consequences. Some people call it a law of unintended consequences, and some call it a principle, and we should be careful about this. The other issue is subjectivity of harms. Given that we have taken “legal but harmful” out and there are amendments to the Bill to tackle harms, there will be a debate on the subjectivity of harms.
One reason I wanted to speak on this group is that some of the amendments tabled by noble Lords—too many to mention—deal with technology notices and ensuring that we are consistent between the offline and online worlds, particularly regarding the Regulation of Investigatory Powers Act. I welcome and support those amendments.
We also have to be aware that people will find a way around it, as the noble Baroness, Lady Fox, said. When I was looking at terrorism and technology, one of the issues that people raised with me was not to forget that one way around it was to create an email account and store stuff in a draft folder. You could then share the username and password with others who could then access that data, those pictures or those instructions in a draft folder. The noble Lord, Lord Allan, has gone some way to addressing that issue.
The other issue that we have to be clear about is how the tech sector can do more. It was interesting when my noble friend Lady Stowell organised a meeting with Meta, at which it was challenged in particular on coroners having access to information and pictures. It was very interesting when Meta told us what it could access: it does not know what is in the messages, but there are things that it can access, or advise people to access, on the user’s phone or at the other end. I am not sure whether the noble Baroness, Lady Kidron, has had that conversation with Meta, but it would be helpful and important to find some common ground there, and to probe and push Meta and others to make sure that they share that information more quickly, so we do not have to wait five years to get it via the coroner or whatever. We ought to push that as much as possible.
I want to talk in particular about unintended consequences, particularly around end-to-end encryption. Even if you do not believe the big businesses and think that they are crying wolf when they say that they will quit the UK—although I believe that there is a real threat of that, particularly when we want the UK to be a global hub for technology and innovation and so cannot afford for companies such as Meta, Signal and others to leave—you should listen to the journalists who work with people in many countries, quite often dissidents, and who rely on encrypted communications to communicate with them.
The other risk we should be aware of is that it is very difficult to keep technology to a few people. In my academic career, I also looked at technology transfer, both intentional and unintentional. Look at the intelligence services and some of the innovations that happened: for example, it was not very long after Concorde was designed that the Soviets got their hands on that technology. Just as there used to be a chap called Bob in the exchange who could share information, there is always a weak spot in the chain: the humans. Lots of humans have a price and can be bought, or they can be threatened, and things can be shared. The unintended consequence I am worried about is that this technology will get into the hands of totalitarian regimes. At the same time, it means that people over here who are desperately trying to help dissidents and others speak up for freedom in other countries will be unable to support them. We should be very careful and think about unintended consequences. For that reason, I support this group of amendments.
I really am looking forward to the responses from the Minister. I know that the noble Lord, Lord McNally, said that he was a Minister for three years on data protection; I was a Minister in this department for one month. I was so pleased that I had my dream job, as Minister for Civil Society and Heritage, and so proud of my party and this country because we had elected the first Asian Prime Minister; then, six days later, I got sacked. So, as they say, be careful what you wish for.
In this particular case, I am grateful to the noble Lords who have spoken up in this debate. I do not want to repeat any other points but just wanted to add that. I will not speak often, but I want to say that it is really critical that, when we look at this trade-off between security, safety and freedom, we get it right. One way of doing that is to make sure that, on technology notices and RIPA, we are consistent between the online and offline worlds.
My Lords, it has been a very good debate indeed. When I first saw this grouping, my heart sank: the idea that we should be able to encompass all that within the space of just over an hour seemed a bit beyond all of us, however skilled and experienced we were, and whatever background we were able to bring to the debate today. I agree with both noble Lords who observed that we have an expertise around here that is very unusual and extremely helpful in trying to drill down into some of these issues.
The good thing that has come out from this debate, which was summed up very well by the noble Lord, Lord Kamall, is that we are now beginning to address some of the underlying currents that the Bill as a boat is resting on—and the boat is a bit shaky. We have a very strong technological bias, and we are grateful for the masterclass from the noble Lord, Lord Allan of Hallam, on what is actually going on in the world that we are trying to legislate for. It leaves me absolutely terrified that we are in a situation where we appear to be trying to future-proof, possibly in the wrong direction. We should be very careful about that. We will want to reflect on the point he made on where the technology is driving this particular aspect of our social media and search engine operations.
My Lords, this has indeed been a very good debate on a large group of amendments. We have benefited from two former Ministers, the noble Lord, Lord McNally, and my noble friend Lord Kamall. I hope it is some solace to my noble friend that, such a hard act is he to follow, his role has been taken on by two of us on the Front Bench—myself at DCMS and my noble friend Lord Camrose at the new Department for Science, Innovation and Technology.
The amendments in this group are concerned with the protection of user privacy under the Bill and the maintenance of end-to-end encryption. As noble Lords have noted, there has been some recent coverage of this policy in the media. That reporting has not always been accurate, and I take this opportunity to set the record straight in a number of areas and seek to provide the clarity which the noble Lord, Lord Stevenson of Balmacara, asked for just now.
Encryption plays a crucial role in the digital realm, and the UK supports its responsible use. The Bill does not ban any service design, nor will it require services materially to weaken any design. The Bill contains strong safeguards for privacy. Broadly, its safety duties require platforms to use proportionate systems and processes to mitigate the risks to users resulting from illegal content and content that is harmful to children. In doing so, platforms must consider and implement safeguards for privacy, including ensuring that they are complying with their legal responsibilities under data protection law.
With regard to private messaging, Ofcom will set out how companies can comply with their duties in a way that recognises the importance of protecting users’ privacy. Importantly, the Bill is clear that Ofcom cannot require companies to use proactive technology, such as automated scanning, on private communications in order to comply with their safety duties.
In addition to these cross-cutting protections, there are further safeguards concerning Ofcom’s ability to require the use of proactive technology, such as content identification technology, on public channels. That is in Clause 124(6) of the Bill. Ofcom must consider a number of matters, including the impact on privacy and whether less intrusive measures would have the equivalent effect, before it can require the use of proactive technology.
The implementation of end-to-end encryption in a way that intentionally blinds companies to criminal activity on their services, however, has a disastrous effect on child safety. The National Center for Missing & Exploited Children in the United States of America estimates that more than half its reports could be lost if end-to-end encryption were implemented without preserving the ability to tackle child sexual abuse—a conundrum with which noble Lords grappled today. That is why our new regulatory framework must encourage technology companies to ensure that their safety measures keep pace with this evolving and pernicious threat, including minimising the risk that criminals are able to use end-to-end encrypted services to facilitate child sexual abuse and exploitation.
Given the serious risk of harm to children, the regulator must have appropriate powers to compel companies to take the most effective action to tackle such illegal and reprehensible content and activity on their services, including in private communications, subject to stringent legal safeguards. Under Clause 110, Ofcom will have a stand-alone power to require a provider to use, or make best endeavours to develop, accredited technology to tackle child sexual exploitation and abuse, whether communicated publicly or privately, by issuing a notice. Ofcom will use this power as a last resort only when all other measures have proven insufficient adequately to address the risk. The only other type of harm for which Ofcom can use this power is terrorist content, and only on public communications.
The use of the power in Clause 110 is subject to additional robust safeguards to ensure appropriate protection of users’ rights online. Ofcom will be able to require the use of technology accredited as being highly accurate only in specifically detecting illegal child sexual exploitation and abuse content, ensuring a minimal risk that legal content is wrongly identified. In addition, under Clause 112, Ofcom must consider a number of matters, including privacy and whether less intrusive means would have the same effect, before deciding whether it is necessary and proportionate to issue a notice.
The Bill also includes vital procedural safeguards in relation to Ofcom’s use of the power. If Ofcom concludes that issuing a notice is necessary and proportionate, it will need to publish a warning notice to provide the company with an opportunity to make representations as to why the notice should not be issued or why the detail contained in it should be amended. In addition, the final notice must set out details of the rights of appeal under Clause 149. Users will also be able to complain to and seek action from a provider if the use of a specific technology results in their content incorrectly being removed and if they consider that the technology is being used in a way that is not envisaged in the terms of service. Some of the examples given by the noble Baroness, Lady Fox of Buckley, pertain in this instance.
The Bill also recognises that in some cases there will be no available technology compatible with the particular service design. As I set out, this power cannot be used by Ofcom to require a company to take any action that is not proportionate, including removing or materially weakening encryption. That is why the Bill now includes an additional provision for this scenario, to allow Ofcom to require technology companies to use their best endeavours to develop or find new solutions that work on their services while meeting the same high standards of accuracy and privacy protection. Given the ingenuity and resourcefulness of the sector, it is reasonable to ask it to do everything possible to protect children from abuse and exploitation. I echo the comments made by the noble Lord, Lord Allan, about the work being done across the sector to do that.
More broadly, the regulator must uphold the right to privacy under its Human Rights Act obligations when implementing the new regime. It must ensure that its actions interfere with privacy only where it is lawful, necessary and proportionate to do so. I hope that addresses the question posed by the noble Lord, Lord Stevenson. In addition, Ofcom will be required to consult the Information Commissioner’s Office when developing codes of practice and relevant pieces of guidance.
I turn now to Amendments 14—
Before the Minister does so, can he give a sense of what he means by “best endeavours” for those technology companies? If it is not going to be general monitoring of what is happening as the message moves from point to point—we have had some discussions about the impracticality and issues attached to monitoring at one end or the other—what, theoretically, could “best endeavours” possibly look like?
I am hesitant to give too tight a definition, because we want to remain technology-neutral and make sure that we are keeping an open mind to developing changes. I will think about that and write to the noble Lord. The best endeavours will inevitably change over time as new technological solutions present themselves. I point to the resourcefulness of the sector in identifying those, but I will see whether there is anything more I can add.
While the Minister is reflecting, I note that the words “best endeavours” are always a bit of a worry. The noble Lord, Lord Allan, made the good point that, once it is on your phone, you are in trouble and you must report it. But the frustration of many people outside this Chamber is what comes next: once material has been on a phone and cannot be dealt with there, how do we trace the journey of that piece of material without breaking encryption? I speak to the tech companies very often—indeed, I used to speak to the noble Lord, Lord Allan, when he was in post at what was then Facebook—but that is the question that we would like answered in this Committee, because the response that “It is nothing to do with us” is where our sympathy stops.
The noble Baroness’s intervention has given me an opportunity to note that I am about to say a little more on best endeavours, which will not fully answer the question from the noble Lord, Lord Knight, but I hope fleshes it out a little more.
I do that in turning to Amendments 14, 108 and 205, which seek to clarify that companies will not be required to undertake fundamental changes to the nature of their service, such as the removal or weakening of end-to-end encryption. As I previously set out, the Bill does not require companies to weaken or remove any design and there is no requirement for them to do so as part of their risk assessments or in response to a notice. Instead, companies will need to undertake risk assessments, including consideration of risks arising from the design of their services, before taking proportionate steps to mitigate and manage these risks. Where relevant, assessing the risks arising from end-to-end encryption will be an integral part of this process.
This risk management approach is well established in almost every other industry and it is right that we expect technology companies to take user safety into account when designing their products and services. We understand that technologies used to identify child sexual abuse and exploitation content, including on private communications, are in some cases nascent and complex. They continue to evolve, as I have said. That is why Ofcom has the power through the Bill to issue a notice requiring a company to make best endeavours to develop or source technology.
This notice will include clear, proportionate and enforceable steps that the company must take, based on the relevant information of the specific case. Before issuing a warning notice, Ofcom is expected to enter into informal consultation with the company and/or to exercise information-gathering powers to determine whether a notice is necessary and proportionate. This consultation period will assist in establishing what a notice to develop a technology may require and appropriate steps for the company to take to achieve best endeavours. That dialogue with Ofcom is part of the process.
There are a lot of phrases here—best endeavours, proportionate, appropriate steps—that are rather subjective. The concern of a number of noble Lords is that we all want to address this issue, but it is a matter of how these tests are applied. That is one of the reasons why noble Lords were asking for some input from the legal profession—a judge or otherwise—to make those judgments.
All the phrases used in the Bill are subject to the usual scrutiny through the judicial process—that is why we debate them now and think about their implications—but of course they can, and I am sure will, be tested in the usual legal ways. Once a company has developed a new technology that meets minimum standards of accuracy, Ofcom may require its use but not before considering matters including the impact on user privacy, as I have set out. The Bill does not specify which tools are likely to be required, as we cannot pre-empt Ofcom’s evidence-based and case-by-case assessment.
Amendment 285 intends to clarify that social media platforms will not be required to undertake general monitoring of the activity of their users. I agree that the protection of privacy is of utmost importance. I want to reassure noble Lords, in particular my noble friend Lady Stowell of Beeston, who asked about it, that the Bill does not require general monitoring of all content. The clear and strong safeguards for privacy will ensure that users’ rights are protected.
Setting out clear and specific safeguards will be more effective in protecting users’ privacy than adopting the approach set out in Amendment 285. Ofcom must consider a number of matters, including privacy, before it can require the use of proactive technology. The government amendments in this group, Amendments 290A to 290G, further clarify that technology which identifies words, phrases or images that indicate harm is subject to all of these restrictions. General monitoring is not a clearly defined concept—a point made just now by my noble friend Lord Kamall. It is used in EU law but is not clearly defined there, and it is not a concept in UK law. This lack of clarity could create uncertainty that some technology companies might attempt to exploit in order to avoid taking necessary and proportionate steps to protect their users. That is why we resist Amendment 285.
I understand the point the Minister is making, but could we be absolutely crystal clear that, whatever phrase is used, the Government are saying on record, at the Dispatch Box, that the Bill can in no way be read as requiring anybody to provide a view into private messaging or encrypted messaging unless there is good legal cause to suspect criminality? That is a point that the noble Baroness, Lady Stowell, made very clearly. One may not like the phrasing used in other legislatures, but could we find a form of words that will make it clear to those who are operating in this legal territory exactly where they stand on that?
My Lords, I want to give clear reassurance that the Bill does not require general monitoring of all content. We have clear and strong safeguards for privacy in the Bill to ensure that users’ rights are protected. I set out the concerns about use of the phrase “general monitoring”. I hope that provides clarity, but I may have missed the noble Lord’s point. The brief answer to the question I think he was asking is yes.
Let the record stand clear: yes. It was the slight equivocation around how the Minister approached and left that point that I was worried about, and that people might seek to use that later. Words from the Dispatch Box are never absolute and they are never meant to be, but the fact that they have been said is important. I am sure that everybody understands that point, and the Minister did say “yes” to my question.
I did, and I am happy to say it again: yes.
Perhaps I might go back to an earlier point. When the Minister said that the Government want to make sure, I think he was implying that certain companies would try to avoid obligations to keep their users safe by threatening to leave or whatever. I want it to be clear that, in the instance of encrypted services, the obligation to the users of the service is to protect their privacy, and users see that as keeping them safe. It would be wrong to make privacy and safety polar opposites. I think that companies that run unencrypted services see their duties rather differently—so there is, in a way, a clash.
Secondly, I am delighted by the clarity in the Minister’s “yes” answer, but I think that maybe there needs to be clearer communication with people outside this Chamber. People are worried that duties placed on Ofcom to enact certain things would lead to some breach of encryption. No one thinks that the Government intend to do this or want to spy on anyone, but the unintended consequences of the duty on Ofcom might have that effect. If that is not going to be the case, and the Government can guarantee that and make it clear, it would reassure not just the companies but the users of messaging services, which would be helpful.
The points the noble Baroness has just made bring me neatly to what I was about to say in relation to the question raised earlier by the noble Lord, Lord Knight of Weymouth. But first, I would say that Ofcom as a public body is subject to public law principles already, so those apply in this case.
The noble Lord, Lord Knight, asked about virtual private networks and the risk of displacing people on to VPNs or other similar alternatives. That is a point worth noting, not just in this group but as we consider all these amendments, particularly when we talk later on about age verification, pornography and so on. Services will need to think about how safety measures could be circumvented and take steps to prevent that, because they need to mitigate risk effectively. There may be a role in enforcement action, too; Ofcom will be able to apply to the courts to require these services, where appropriate, to apply business disruption measures. We should certainly be mindful of the incentives for people to do that, and the example the noble Lord, Lord Knight, gave earlier is a useful lesson in the old adage “caveat emptor” when looking at some of these providers.
I want to say a little bit about Amendments 205A and 290H in my name. Given the scale of child sexual abuse and exploitation that takes place online, and the reprehensible nature of these crimes, it is important that Ofcom has effective powers to require companies to tackle it. This brings me to these government amendments, which make small changes to the powers in Clause 110 to ensure that they are effective. I will focus particularly, in the first instance, on Amendment 290H, which ensures that Ofcom considers whether a service has features that allow content to be shared widely via another service when deciding whether content has been communicated publicly or privately, including for the purposes of issuing a notice. This addresses an issue highlighted by the Independent Reviewer of Terrorism Legislation, Jonathan Hall, and Professor Stuart Macdonald in a recent paper. The separate, technical amendment, Amendment 205A, clarifies that Clause 110(7) refers only to a notice on a user-to-user service.
Amendment 190 in the name of the noble Lord, Lord Clement-Jones, seeks to introduce a new privacy duty on Ofcom when considering whether to use any of its powers. The extensive privacy safeguards that I have already set out, along with Ofcom’s human rights obligations, would make this amendment unnecessary. Ofcom must also explicitly consult persons whom it considers to have expertise in the enforcement of the criminal law and the protection of national security, which is relevant to online safety matters in the course of preparing its draft codes. This may include the integrity and security of internet services where relevant.
Amendments 202 and 206, in the name of the noble Lord, Lord Stevenson of Balmacara, and Amendments 207, 208, 244, 246, 247, 248, 249 and 250 in the name of the noble Lord, Lord Clement-Jones, all seek to deliver privacy safeguards to notices issued under Clause 110 through additional review and appeals processes. There are already strong safeguards concerning this power. As part of the warning notice process, companies will be able to make representations to Ofcom which it is bound to consider before issuing a notice. Ofcom must also review any notice before the end of the period for which it has effect.
Amendment 202 proposes mirroring the safeguards of the Investigatory Powers Act when issuing notices to encrypted messaging services under this power. First, this would be inappropriate, because the powers in the Investigatory Powers Act serve different purposes from those in this Bill. The different legal safeguards in the Investigatory Powers Act reflect the potential intrusion by the state into an individual’s private communications; that is not the case with this Bill, which does not grant investigatory powers to state bodies, such as the ability to intercept private communications. Secondly, making a reference to encryption would be—
Is that right? I do not need a yes or no answer. It was rhetorical; I am just trying to frame the right question. The Minister is making a very strong point about the difference between RIPA requirements and those that might be brought in under this Bill. But it does not really get to the bottom of the questions we were asking. In this situation, whatever the exact analogy between the two systems is, it is clear that Ofcom is marking its own homework—which is fair enough, as there are representations, but it is not getting external advice or seeking judicial approval.
The Minister’s point was that that was okay because it was private companies involved. But we are saying here that these would be criminal offences taking place, and therefore there is bound to be interest from the police and other agencies, including anti-terrorism agencies. It is clearly similar to the RIPA arrangements, so could he just revisit that?
Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.
The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.
I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.
Before my noble friend moves on, when he is reviewing that back in the office, could he also satisfy himself that the concerns coming from the journalism and news organisations in the context of RIPA are also understood and have been addressed? That is another angle which, from what my noble friend has said so far, I am not sure has really been acknowledged. That is not a criticism but it is worth him satisfying himself on it.
I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.
Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.
I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate only in detecting child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.
Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.
I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.
I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.
Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.
To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate to Ofcom’s annual report to allow for full scrutiny of this power.
The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.
Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.
As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.
Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
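To make the hash-matching approach the Minister describes concrete, the following is a minimal illustrative sketch only. It is not drawn from the Bill or from any accredited tool: the hash list and its value are hypothetical placeholders, and it uses a cryptographic hash (SHA-256) for simplicity, whereas production systems such as Microsoft’s PhotoDNA use perceptual hashes designed to survive resizing and re-encoding.

    # Illustrative sketch of hash-matching; not an accredited tool.
    # A cryptographic hash matches only byte-identical files; deployed
    # systems use perceptual hashing so re-encoded copies still match.
    import hashlib

    # Hypothetical set of hashes of images already confirmed as illegal
    # by an accredited body; the value below is a placeholder only.
    KNOWN_ILLEGAL_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def hash_image(image_bytes: bytes) -> str:
        # Assign a unique number (fingerprint) to the image's raw bytes.
        return hashlib.sha256(image_bytes).hexdigest()

    def is_known_illegal(image_bytes: bytes) -> bool:
        # Compare an uploaded image's fingerprint against the known list.
        return hash_image(image_bytes) in KNOWN_ILLEGAL_HASHES

On this model, a service never interprets the content of an image; it compares fingerprints against a list compiled elsewhere, which is why the approach is argued to be less intrusive than content analysis.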
Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.
Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110 where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.
Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.
I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.
My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.
I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.
The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.
The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups, that is not a matter for the state; but it is when it is somebody in a position of public authority, and we have a right to intervene there. Again, we have to remember that, as long as it is not illegal, people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.
The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.
The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security and we should not lose that.
The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.
The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that, if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and in a similar way in the UK and the US, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.