Lord Bethell (Con)

My Lords, I will build on my noble friend’s comments. We have what I call the Andrew Tate problem. That famous pornographer and disreputable character started a business in a shed in Romania with a dozen employees. By most people’s assessment it would have been considered a small business but, through content depicting pornography and the physical assault of women, he extremely quickly built something that served an estimated 3 billion pages, and it has had a huge impact on the children of the English-speaking world. A small business became a big, nasty business very quickly. That anecdote reinforces the point that small does not mean safe and, although I agree with many of my noble friend’s points, the lens of size is perhaps not the right one to look through.

Lord Allan of Hallam (LD)

My Lords, I did not want to interrupt the noble Lord, Lord Moylan, in full flow as he introduced the amendments, but I believe he made an error in terms of the categorisation. The error is entirely rational, because he took the logical position rather than the one in the Bill. It is a helpful error because it allows us to quiz the Minister on the rationale for the categorisation scheme.

As I read it, in Clause 86, the categories are: category 1, which is large user-to-user services; category 2A, which is search or combined services; and category 2B, which is small user-to-user services. To my boring and logical binary brain, I would expect it to be: “1A: large user-to-user”; “1B: small user-to-user”; “2A: large search”; and “2B: small search”. I am curious about why a scheme like that was not adopted and we have ended up with something quite complicated. It is not only that: we now have this Part 3/Part 5 thing. I feel that we will be confused for years to come: we will be deciding whether something is a Part 3 2B service or a Part 5 service, and we will end up with a soup of numbers and letters that do not conform to any normal, rational approach to the world.

--- Later in debate ---
The Parliamentary Under-Secretary of State, Department for Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I must first apologise for my slightly dishevelled appearance as I managed to spill coffee down my shirt on my way to the Chamber. I apologise for that—as the fumes from the dried coffee suffuse the air around me. It will certainly keep me caffeinated for the day ahead.

Search services play a critical role in users’ online experience, allowing them easily to find and access a broad range of information online. Their gateway function, as we have discussed previously, means that they also play an important role in keeping users safe online because they have significant influence over the content people encounter. The Bill therefore imposes stringent requirements on search services to tackle the risks from illegal content and to protect children.

Amendments 13, 15, 66 to 69 and 73, tabled by my noble friend Lord Moylan, seek to narrow the scope of the Bill so that its search safety duties apply only to the largest search services—categorised in the Bill as category 2A services—rather than to all search services. Narrowing the scope in this way would have an adverse impact on the safety of people using search services, including children. Search services, including combined services, below the category 2A threshold would no longer have a duty to minimise the risk of users encountering illegal content, or of children encountering harmful content, in or via search results. This would increase the likelihood of users, including children, accessing illegal content, and of children accessing harmful content, through these services.

The Bill already takes a targeted approach and the duties on search services will be proportionate to the risk of harm and the capacity of companies. This means that services which are smaller and lower-risk will have a lighter regulatory burden than those which are larger and higher-risk. All search services will be required to conduct regular illegal content risk assessments and, where relevant, children’s risk assessments, and then implement proportionate mitigations to protect users, including children. Ofcom will set out in its codes of practice specific steps search services can take to ensure compliance and must ensure that these are proportionate to the size and capacity of the service.

The noble Baroness, Lady Kidron, and my noble friend Lady Harding of Winscombe asked how search services should conduct their risk assessments. Regulated search services will have a duty to conduct regular illegal content risk assessments, and where a service is likely to be accessed by children it will have a duty to conduct regular children’s risk assessments, as I say. They will be required to assess the level and nature of the risk of individuals encountering illegal content on their service, to implement proportionate mitigations to protect people from illegal content, and to monitor them for effectiveness. Services likely to be accessed by children will also be required to assess the nature and level of risk of their service specifically for children to identify and implement proportionate mitigations to keep children safe, and to monitor them for effectiveness as well.

Companies will also need to assess how the design and operation of the service may increase or reduce the risks identified and Ofcom will have a duty to issue guidance to assist providers in carrying out their risk assessments. That will ensure that providers have, for instance, sufficient clarity about what an appropriate risk assessment looks like for their type of service.

The noble Lord, Lord Allan, and others asked about definitions, and I congratulate noble Lords on avoiding the obvious “To be, or not to be” pun in the debate we have just had. The noble Lord, Lord Allan, is right in the definition he set out. On the rationale for it, it is simply that we have designated as category 1 the largest and riskiest services, and as category 2 the smaller and less risky ones, splitting them between 2A, search services, and 2B, user-to-user services. We think that is a clear framework. The definitions are set out a bit more in the Explanatory Notes, but that is the rationale.

Lord Allan of Hallam (LD)

I am grateful to the Minister for that clarification. I take it, then, that the Government’s working assumption is that all search services, including the biggest ones, are by definition less risky than the larger user-to-user services. I simply seek confirmation that that is the thinking which has informed this.

Lord Parkinson of Whitley Bay (Con)

As I said, the largest and riskiest sites may involve some which have search functions, so the test of large and most risky applies. Smaller and less risky search services are captured in category 2A.

Amendment 157 in the name of my noble friend Lord Pickles, and spoken to by the noble Baroness, Lady Deech, seeks to apply new duties on the largest search services. I agree with the objectives in my noble friend’s amendment of increasing transparency about the search services’ operations and enabling users to hold them to account. It is not, however, an amendment I can accept because it would duplicate existing duties while imposing new duties which we do not think are appropriate for search services.

As I say, the Bill will already require search services to set out how they are fulfilling their illegal content and child safety duties in publicly available statements. The largest search services—category 2A—will also be obliged to publish a summary of their risk assessments and to share this with Ofcom. That will ensure that users know what to expect on those search services. In addition, they will be subject to the Bill’s requirements relating to user reporting and redress. These will ensure that search services put in place effective and accessible mechanisms for users to report illegal content and content which is harmful to children.

My noble friend’s amendment would ensure that the requirements to comply with its publicly available statements applied to all actions taken by a search service to prevent harm, not just those relating to illegal content and child safety. This would be a significant expansion of the duties, resulting in Ofcom overseeing how search services treat legal content which is accessed by adults. That runs counter to the Government’s stated desire to avoid labelling legal content which is accessed by adults as harmful. It is for adult users themselves to determine what legal content they consider harmful. It is not for us to put in place measures which could limit their access to legal content, however distasteful. That is not to say, of course, that where material is illegal in nature we do not share the determination of the noble Baroness, my noble friend and others to make sure that it is properly tackled. The Secretary of State and Ministers have had extensive meetings with groups making representations on this point and I am very happy to continue speaking to my noble friend, the noble Baroness and others if they would welcome it.

I hope that that provides enough reassurance for the amendment to be withdrawn at this stage.

--- Later in debate ---
Moved by
14: Clause 6, page 5, line 38, at end insert—
“(6A) Providers of regulated user-to-user services are required to comply with duties under subsections (2) to (6) for each such service which they provide to the extent that is proportionate and technically feasible without making fundamental changes to the nature of the service (for example, by removing or weakening end-to-end encryption on an end-to-end encrypted service).”
Member’s explanatory statement
This amendment is part of a series of amendments by Lord Clement-Jones intended to ensure risk assessments are not used as a tool to undermine users’ privacy and security.
Lord Allan of Hallam (LD)

My Lords, I propose Amendment 14 on behalf of my noble friend Lord Clement-Jones and the noble Lord, Lord Hunt of Kings Heath, who are unable to be present today due to prior commitments. I notice that the amendment has also been signed by the noble Baroness, Lady Fox, who I am sure will speak to it herself. I shall speak to the group of amendments as a whole.

I shall need to speak at some length to this group, as it covers some quite complex issues, even for this Bill, but I hope that the Committee will agree that this is appropriate given the amendments’ importance. I also expect that this is one area where noble Lords are receiving the most lobbying from different directions, so we should do it justice in our Committee.

We should start with a short summary of the concern that lies behind the amendments: that the Bill, as drafted, particularly under Clause 110, grants Ofcom the power to issue technical notices to online services that could, either explicitly or implicitly, require them to remove privacy protections—and, in particular, that this could undermine a technology that is increasingly being deployed on private messaging services called end-to-end encryption. The amendments in this group use various mechanisms to reduce the likelihood of that being an outcome. Amendments 14 and 108 seek to make it clear in the Bill that end-to-end encryption would be out of scope—and, as I understand it, Amendment 205, tabled by the noble Lord, Lord Moylan, seeks to do something similar.

A second set of amendments would add in extra controls over the issuing of technical notices. While not explicitly saying that these could not target E2EE—if noble Lords will excuse the double negative—they would make it less likely by ensuring that there is more scrutiny. They include a whole series of amendments—Amendments 202 and 206, tabled by the noble Lord, Lord Stevenson, and Amendment 207—that have the effect of ensuring that there is more scrutiny and input into issuing such a notice.

The third set of amendments aims to ensure that Ofcom gives weight to privacy more generally in all the actions it takes. In particular, Amendment 190 proposes a broader privacy duty, and Amendment 285—which I think the noble Lord, Lord Moylan, will be excited about—seeks to restrict general monitoring.

I will now dig into why this is important. Put simply, there is a risk that under the Bill a range of internet services will feel that they are unable to offer their products in the UK. This speaks to a larger question as we debate the measures in the Bill, as it can sometimes feel as though we are comfortable ratcheting up the requirements in the Bill under the assumption that services will have no choice but to meet them and carry on. While online services will not have a choice about complying if they wish to be lawfully present in the UK, they will be free to exit the market altogether if they believe that the requirements are excessively onerous or impossible to meet.

In the Bill, we are constructing a de facto licensing mechanism, whereby Ofcom will contact in-scope services—the category 2A, category 2B, Part 3 and Part 5 services we discussed in relation to the previous group of amendments—order them to follow all the relevant regulation and guidance, and instruct them to pay a fee for that supervision. We have to consider that some services, on receipt of that notice, will take steps to restrict access by people in the UK rather than agree to such a licence. Where those are rogue services, this reaction is consistent with the aims of the Bill. We do not want services which are careless about online safety to be present in the UK market. But I do not believe that it is our aim to force mainstream services out of the UK market and, if there is a chance of that happening, it should give us pause for thought.

As a general rule, I am not given to apocalyptic warnings, but I believe there is a real risk that some of the concerns that noble Lords will be receiving in their inboxes are genuine, so I want to unpick why that may be the case. We should reflect for a moment on the assumptions we may have about the people involved in this debate and their motivations. We often see tech people characterised as oblivious to harms, and security services people as uncaring about human rights. In my experience, both caricatures are off the mark, as tech people hate to see their services abused and security service representatives understand that they need to be careful about how they exercise the great powers we have given them. We should note that, much of the time, those two communities work well together in spaces such as the Global Internet Forum to Counter Terrorism.

If this characterisation is accurate, why do I think we may have a breakdown over the specific technology of end-to-end encryption? To understand this subject, we need to spend a few moments looking at trends in technology and regulation over recent years. First, we can look at the growth of content-scanning tools, which I think may have been in the Government’s mind when they framed and drafted the new Clause 110 notices. As social media services developed, they had to consider the risks of hosting content on the services that users had uploaded. That content could be illegal in all sorts of ways, including serious forms, such as child sexual abuse material and terrorist threats, as well as things such as copyright infringement, defamatory remarks and so on. Platforms have strong incentives to keep that material off their servers for both moral and legal reasons, so they began to develop and deploy a range of tools to identify and remove it. As a minimum, most large platforms now deploy systems to capture child sexual abuse material and copyright-infringing material, using technologies such as PhotoDNA and Audible Magic.

--- Later in debate ---
Lord Allan of Hallam (LD)

I point out that one of the benefits of end-to-end encryption is that it precisely stops companies doing things such as targeted advertising based on the content of people’s communications. Again, I think there is a very strong and correct trend to push companies in that direction.

Baroness Kidron (CB)

I thank the noble Lord for the intervention. For those noble Lords who are not following the numbers, Amendment 285, which I support, would prevent general monitoring. Apart from anything else, I am worried about equivalence and other issues in relation to general monitoring. Quite apart from the principled position against it, I think it is helpful to be explicit.

Ofcom needs to be very careful, and that is what Amendment 190 sets out. It asks whether the alternatives have been thought about, whether the conditions have been thought about, and whether the potential impact has been thought about. That series of questions is essential. I am probably closer to the community that wants to see more powers and more interventions, but I would like that to be in a very monitored and regulated form.

I thank the noble Lord for his contribution. Some of these amendments must be supported because it is worrying for us as a country to have—what did the noble Lord call it?—ambiguity about whether something is possible. I do not think that is a useful ambiguity.

--- Later in debate ---
Baroness Fox of Buckley (Non-Afl)

I agree with the noble Baroness, which is precisely why I am suggesting that we need to consider whether privacy should be sacrificed totally in relation to the argument around encryption. It is difficult, and I feel awkward saying it. When I mentioned a silver bullet I was not talking about the noble Baroness or any other noble Lords present, but I have heard people say that we need this Bill because it will deal with child abuse. In this group of amendments, I am raising the fact that when I have talked about encryption with people outside of the House they have said that we need to do something to tackle the fact that these messages are being sent around. It is not just child abuse; it is also terrorism. There is a range of difficult situations.

Things can go wrong with this, and that is what I was trying to raise. For example, we have a situation where some companies are considering using, or are being asked to use, machine learning to detect nudity. Just last year, a father lost his Google account and was reported to the police for sending a naked photo of his child to the doctor for medical reasons. I am raising these as examples of the problems that we have to consider.

Child abuse is so abhorrent that we will do anything to protect children, but let me say this to the Committee, as it is where the point on privacy lies: children are largely abused in their homes, but as far as I understand it we are not as yet arguing that the state should put CCTV cameras in every home for 24/7 surveillance to stop child abuse. That does not mean that we are glib or that we do not understand the importance of child abuse; it means that we understand the privacy of your home. There are specialist services that can intervene when they think there is a problem. I am worried about the possibility of putting a CCTV camera in everyone’s phone, which is the danger of going down this route.

My final point is that these services, such as WhatsApp, will potentially leave the UK. It is important to note that. I agree with the noble Lord, Lord Allan: this is not like threatening to storm off. It is not done in any kind of pique in that way. In putting enormous pressure on these platforms to scan communications, we must remember that they are global platforms. They have a system that works for billions of people all around the world. A relatively small market such as the UK is not something for which they would compromise their billions of users around the world. As I have explained, they would not put up with it if the Chinese state said, “We have to see people’s messages”. They would just say, “We are encrypted services”. They would walk out of China and we would all say, “Well done”. There is a real, strong possibility of these services leaving the UK so we must be very careful.

Lord Allan of Hallam (LD)

I just want to add to the exchange between the noble Baronesses, Lady Kidron and Lady Fox. The noble Baroness, Lady Fox, referred to WhatsApp’s position. Again, it is important for the public out there also to understand that if someone sends them illegal material—in particular child sexual abuse material; I agree with the noble Baroness, Lady Kidron, that this is a real problem—and they report it to WhatsApp, which has a reporting system, that material is no longer encrypted. It is sent in clear text and WhatsApp will give it to the police. One of the things I am suggesting is that, rather than driving WhatsApp out of the country, because it is at the more responsible end of the spectrum, we should work with it to improve these kinds of reporting systems and put the fear of God into people so that they know that this issue is not cost-free.

As a coda to that, if you ever receive something like that, you should report it to the police straightaway because, once it is on your phone, you are liable and you have a problem. The message from here should be: if you receive it, report it and, if it is reported, make sure that it gets to the police. We should be encouraging services to put those systems in place.

Baroness Fox of Buckley (Non-Afl)

The noble Lord has concluded with my conclusion, which was to say that those services will be driven out, but not because they are irresponsible around horrible, dangerous messages. They do not read our messages because they are private. However, if we ever receive anything that makes us feel uncomfortable, they should be put under pressure to act. Many of them already do and are actually very responsible, but that is different from demanding that they scan our messages and we breach that privacy.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

Yes, I think it is right. The Investigatory Powers Act is a tool for law enforcement and intelligence agencies, whereas the Bill is designed to regulate technology companies—an important high-level distinction. As such, the Bill does not grant investigatory powers to state bodies. It does not allow the Government or the regulator to access private messages. Instead, it requires companies to implement proportionate systems and processes to tackle illegal content on their platforms. I will come on to say a little about legal redress and the role of the courts in looking at Ofcom’s decisions so, if I may, I will respond to that in a moment.

Lord Allan of Hallam (LD)

The Investigatory Powers Act includes a different form of technical notice, which is to put in place surveillance equipment. The noble Lord, Lord Stevenson, has a good point: we need to ensure that we do not have two regimes, both requiring companies to put in place technical equipment but with quite different standards applying.

Lord Parkinson of Whitley Bay (Con)

I will certainly take that point away and I understand, of course, that different Acts require different duties of the same platforms. I will take that away and discuss it with colleagues in other departments who lead on investigatory powers.

--- Later in debate ---
Lord Parkinson of Whitley Bay (Con)

I am about to talk about the safeguards for journalists in the context of the Bill and the questions posed by the noble Baroness, Lady Bennett. However, I take my noble friend’s point about the implications of other Acts that are already on the statute book in that context as well.

Just to finish the train of thought of what I was saying on Amendment 202, making a reference to encryption, as it suggests, would be out of step with the wider approach of the Bill, which is to remain technology-neutral.

I come to the safeguards for journalistic protections, as touched on by the noble Baroness, Lady Bennett. The Government are fully committed to protecting the integrity of journalistic sources, and there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources. Any tools required on private communications must be accredited by Ofcom as highly accurate in detecting only child sexual abuse and exploitation content. These minimum standards of accuracy will be approved and published by the Secretary of State, following advice from Ofcom. We therefore expect it to be very unlikely that journalistic content will be falsely detected by the tools being required.

Under Clause 59, companies are obliged to report child sexual abuse material which is detected on their service to the National Crime Agency; this echoes a point made by the noble Lord, Lord Allan, in an earlier contribution. That would include child sexual abuse and exploitation material identified through tools required by a notice and, even in this event, the appropriate protections in relation to journalistic sources would be applied by the National Crime Agency if it were necessary to identify individuals involved in sharing illegal material.

Lord Allan of Hallam (LD)

I want to flag that in the context of terrorist content, this is quite high risk for journalists. It is quite common for them, for example, to be circulating a horrific ISIS video not because they support ISIS but because it is part of a news article they are putting together. We should flag that terrorist content in particular is commonly distributed by journalists and it could be picked up by any system that is not sufficiently sophisticated.

Lord Parkinson of Whitley Bay (Con)

I see that my noble friend Lord Murray of Blidworth has joined the Front Bench in anticipation of the lunch-break business for the Home Office. That gives me the opportunity to say that I will discuss some of these points with him, my noble friend Lord Sharpe of Epsom and others at the Home Office.

Amendment 246 aims to ensure that there is no requirement for a provider to comply with a notice until the High Court has determined the appeal. The Government have ensured that, in addition to judicial review through the High Court, there is an accessible and relatively affordable alternative means of appealing Ofcom’s decisions via the Upper Tribunal. We cannot accept amendments such as this, which could unacceptably delay Ofcom’s ability to issue a notice, because that would leave children vulnerable.

To ensure that Ofcom’s use of its powers under Clause 110, and the technology that underpins it, are transparent, Ofcom will produce an annual report about the exercise of its functions using these powers. This must be submitted to the Secretary of State and laid before Parliament. The report must also provide the details of technology that has been assessed as meeting minimum standards of accuracy, and Ofcom may also consider other factors, including the impact of technologies on privacy. That will be separate from Ofcom’s annual report to allow for full scrutiny of this power.

The legislation also places a statutory requirement on Ofcom to publish guidance before its functions with regard to Clause 110 come into force. This will be after Royal Assent, given that the legislation is subject to change until that point. Before producing the guidance, Ofcom must consult the Information Commissioner. As I said, there are already strong safeguards regarding Ofcom’s use of these powers, so we think that this additional oversight is unnecessary.

Amendments 203 and 204, tabled by the noble Lord, Lord Clement-Jones, seek to probe the privacy implications of Ofcom’s powers to require technology under Clause 110. I reiterate that the Bill will not ban or weaken any design, including end-to-end encryption. But, given the scale of child sexual abuse and exploitation taking place on private communications, it is important that Ofcom has effective powers to require companies to tackle this abhorrent activity. Data from the Office for National Statistics show that in nearly three-quarters of cases where children are contacted online by someone they do not know, this takes place by private message. This highlights the scale of the threat and the importance of technology providers taking steps to safeguard children in private spaces online.

As already set out, there are already strong safeguards regarding the use of this power, and these will prevent Ofcom from requiring the use of any technology that would undermine a platform’s security and put users’ privacy at risk. These safeguards will also ensure that platforms will not be required to conduct mass scanning of private communications by default.

Until the regime comes into force, it is of course not possible to say with certainty which tools would be accredited. However, some illustrative examples of the kinds of current tools we might expect to be used—providing that they are highly accurate and compatible with a service’s design—are machine learning or artificial intelligence, which assess content to determine whether it is illegal, and hashing technology, which works by assigning a unique number to an image that has been identified as illegal.
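
[For readers unfamiliar with the mechanism, the hashing approach the Minister describes can be sketched in a few lines. This is an illustrative sketch only, not any accredited tool: it uses Python’s standard SHA-256 as a stand-in for the perceptual hashes used by real systems such as PhotoDNA (which also match resized or re-encoded copies), and the image bytes and digest list below are entirely hypothetical examples.]

```python
import hashlib

# Sketch of hash matching: assign a "unique number" (digest) to an image
# and compare it against digests of previously identified images.
# SHA-256 stands in for a perceptual hash; all data here is hypothetical.

def fingerprint(image_bytes: bytes) -> str:
    """Return the digest (the "unique number") for an image."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of digests of known, previously identified images.
known_digests = {fingerprint(b"known-example-image-bytes")}

def is_known_match(image_bytes: bytes) -> bool:
    """True if the image's digest appears in the known-digest database."""
    return fingerprint(image_bytes) in known_digests

print(is_known_match(b"known-example-image-bytes"))  # exact copy: True
print(is_known_match(b"different-image-bytes"))      # no match: False
```

[An exact cryptographic hash like this matches only byte-identical copies; the point of the perceptual hashing deployed in practice is to tolerate alterations while keeping false matches rare, which is what the minimum standards of accuracy mentioned above would measure.]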

Given the particularly abhorrent nature of the crimes we are discussing, it is important that services giving rise to a risk of child sexual abuse and exploitation in the UK are covered, wherever they are based. The Bill, including Ofcom’s ability to issue notices in relation to this or to terrorism, will therefore have extraterritorial effect. The Bill will apply to any relevant service that is linked to the UK. A service is linked to the UK if it has a significant number of UK users, if UK users form a target market or if the service is capable of being used in the UK and there is a material risk of significant harm to individuals in the UK arising from the service. I hope that that reassures the noble Lord, on behalf of his noble friend, about why that amendment is not needed.

Amendments 209 to 214 seek to place additional requirements on Ofcom to consider the effect on user privacy when using its powers under Clause 110. I agree that tackling online harm needs to take place while protecting privacy and security online, which is why Ofcom already has to consider user privacy before issuing notices under Clause 110, among the other stringent safeguards I have set out. Amendment 202A would impose a duty on Ofcom to issue a notice under Clause 110 where it is satisfied that it is necessary and proportionate to do so—this will have involved ensuring that the safeguards have been met.

Ofcom will have access to a wide range of information and must have the discretion to decide the most appropriate course of action in any particular scenario, including where this action lies outside the powers and procedures conferred by Clause 110; for instance, an initial period of voluntary engagement. This is an in extremis power. It is essential that we balance users’ rights with the need to enable a strong response, so Ofcom must be able to assess whether any alternative, less intrusive measures would effectively reduce the level of child sexual exploitation and abuse or terrorist content occurring on a service before issuing a notice.

I hope that that provides reassurance to noble Lords on the amendments in this group, and I invite the noble Lord to withdraw Amendment 14.

Lord Allan of Hallam Portrait Lord Allan of Hallam (LD)
- Hansard - -

My Lords, this has been a very useful debate and serves as a good appetite builder for lunch, which I understand we will be able to take shortly.

I am grateful to the Minister for his response and to all noble Lords who have taken part in the debate. As always, the noble Baroness, Lady Kidron, gave us a balanced view of digital rights—the right to privacy and to security—and the fact that we should be trying to advance these two things simultaneously. She was right again to remind us that this is a real problem and there is a lot we can do. I know she has worked on this through things such as metadata—understanding who is communicating with whom—which might strike that nice balance where we are not infringing on people’s privacy too grossly but are still able to identify those who wish harm on our society and in particular on our children.

The noble Baroness, Lady Bennett, was right to pick up this tension between everything, everywhere, all at once and targeted surveillance. Again, that is really interesting to tease out. I am personally quite comfortable with quite intrusive targeted surveillance. I do not know whether noble Lords have been reading the Pegasus spyware stories: I am not comfortable with some Governments placing such spyware on the phones of human rights defenders but I would be much more relaxed about the British authorities placing something similar on the phones of people who are going to plant bombs in Manchester. We need to be really honest about where we are drawing our red lines if we want to go in the direction of targeted surveillance.

The noble Lord, Lord Moylan, was right again to remind us about the importance of private conversations. I cited the example of police officers whose conversations have been exposed. Although it is hard, we should remember that if ordinary citizens want to exchange horrible racist jokes with each other and so on in private groups, that is not a matter for the state; but it is when it is somebody in a position of public authority, and we have a right to intervene there. Again, we have to remember that as long as it is not illegal people can say horrible things in private, and we should not encourage any situation where we suggest that the state would interfere unless there are legitimate grounds—for example, it is a police officer or somebody is doing something that crosses the line of legality.

The noble Baroness, Lady Fox, reminded us that it is either encrypted or it is not. That is really helpful, as things cannot be half encrypted. If a service provider makes a commitment, it is critical that it is truthful. That is what our privacy law tells us. If I say, “This service is encrypted between you and the person you send the message to”, and I know that there is somebody in between who could access it, I am lying. I cannot say it is a private service unless it is truly private. We have to bear that in mind. Historically, people might have been more comfortable with fudging it, but not in 2023, when we have this raft of privacy legislation.

The noble Baroness is also right to remind us that privacy can be safety. There is almost nothing more devastating than the leaking of intimate images. When services such as iCloud move to encrypted storage, that dramatically reduces the risk that somebody will get access to your intimate images if you store them there, which you are legally entitled to do. Privacy can be a critical part of an individual maintaining their own security, and we should not lose that.

The noble Baroness, Lady Stowell, was right again to talk about general monitoring. I am pleased that she found the WhatsApp briefing useful. I was unable to attend but I know from previous contact that there are people doing good work and it is sad that that often does not come out. We end up with this very polarised debate, which my noble friend Lord McNally was right to remind us is unhelpful. The people south of the river are often working very closely in the public interest with people in tech companies. Public rhetoric tends to focus on why more is not being done; there are very few thanks for what is being done. I would like to see the debate move a little more in that direction.

The noble Lord, Lord Knight, opened up a whole new world of pain with VPNs, which I am sure we will come back to. I say simply that if we get the regulatory frameworks right, most people in Britain will continue to use mainstream services as long as they are allowed to be offered. If those services are regulated by the European Union under its Digital Services Act, and by the UK and the US in a similar way, they will in effect have global standards, so it will not matter where you VPN from. The scenario the noble Lord painted, which I worry about, is where those mainstream services are not available and we drive people into small, new services that are not regulated by anyone. We would then end up inadvertently driving people back to the wild west that we complain about, when most of them would prefer to use mainstream services that are properly regulated by Ofcom, the European Commission and the US authorities.