(9 months, 2 weeks ago)
Grand Committee

My Lords, I thank the Minister for that explanation. I have to say that my recollection is that the issue is much wider than the exemption and ensuring that there is no tip-off to somebody who is about to be visited by immigration enforcement. Let me give an example that was borne out after the Act was passed: solicitors acting for data subjects were unable, as we had anticipated, to find out what the Home Office thought it knew—I put it that way deliberately—about their clients.
I have some general points to make; I will do so fairly quickly. It would be optimistic to think that the Home Office had taken from this saga that objections and criticisms—in the form of amendments, obviously—can be helpful because we could have avoided a lot of effort in rectification. My noble friend Lord Clement-Jones will go into some of the history; I must admit, I do not recall much detail except for being teased frequently by the noble Baroness, Lady Williams, when she was the Home Office Minister, because I brought up our objection to the immigration exemption so often.
I feel strongly that it should not have to be for non-governmental organisations that are no doubt strapped for cash to do so much in order to get things right. I appreciate that that is part of our democracy; I do not object at all to the fact that they can do so, of course, but they should not have to. An application, an appeal, another judicial review, another appeal—at what cost to those organisations and the taxpayer! I emphasise that there is an exclamation mark, not a question mark, at the end of that sentence.
This saga is one of those episodes that vindicates the role of the courts, often in language that I, for one, relish. We have spent a lot of time in the Chamber recently discussing the role of the courts in our constitution; to give one example of the language, I really liked the understated use of
“over-broad derogations from fundamental rights”.
As the Minister said, the litigants were consulted before the publication of the SI. The Secondary Legislation Scrutiny Committee reports that they made three points, of which one, on oversight, was rejected by the Home Office and one was regarded by the Home Office as not necessary. Can the Minister tell the Committee what these were and why they were not pursued?
On the detail of the instrument, I note that it will be a matter for the Secretary of State to balance the risks to the individual and the risks to the state. I happen to think that it is in the public interest to apply exemptions with a very light touch, but of course it is no secret that the Liberal Democrats have problems with the Home Office’s immigration policy, and I fear that the reputational ship is well on its way. Clearly, there is an imbalance of power. That is inevitable, but it is not easy for the individual data subject to exercise his rights, and we should be aware of that.
Can the Minister also tell us what the Home Office will do to ensure that there will be transparency of decisions so that it can appropriately be held to account? Mechanisms must be written into the procedures. New paragraph 4B of Schedule 2 provides for a record of decisions and reasons. How will that be published and what will happen to it?
Will the Minister also comment on the capacity of immigration enforcement—and whoever else needs to—to look at prospective decisions on a case-by-case basis for each disapplication? I recognise that that will not necessarily be a straightforward and easy exercise, but it certainly requires a great deal more than, “It’s okay; it’s immigration, so we can just rely on the exemption”. Case-by-case decision-making is very important.
Finally, I note that the Explanatory Memorandum tells us that there is no full impact assessment because the instrument
“does not substantively alter the safeguards and considerations for applying the Immigration Exemption”.
I have to say that I thought that was the point.
My Lords, this set of regulations is a step forward, but with all the caveats that my noble friend made, and I have some more.
As the Minister confirmed, these regulations are the result of the Open Rights Group case—the Court of Appeal judgment in the3million & Anor, R (on the application of) v Secretary of State for the Home Department & Anor—which confirms the earlier High Court judgment in March 2023. In broad terms, the Court of Appeal found that the immigration exemption in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR, as the Minister said. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23 of the UK GDPR. It was therefore held to be unlawful and was disapplied.
These regulations follow two previous attempts by the Home Office to craft an immigration exemption which contained sufficient safeguards to satisfy the requirements set out in Article 23 of the UK GDPR. This is the third shot at it. In order to make the immigration exemption compatible with the requirements of Article 23, as the Minister explained, the Government added a number of safeguards to the exemption which were not there before. These are set out in the regulations. They are worth stating because they are really important requirements, which were omitted previously.
They include requirements to: make decisions on the application of the exemption on a case-by-case basis; make separate decisions in respect of each of the relevant UK GDPR provisions which relates to the data subject; make fresh decisions on each occasion where there is consideration or restriction of any of the relevant UK GDPR provisions in relation to the data subject; take into account all the circumstances of the case, including the potential vulnerability of the data subject, and so on; and apply the exemption only if the application of the particular UK GDPR provision would give rise to a substantial risk of prejudice that outweighs the risk of prejudice to the interests of the data subject, ensuring that the application of the exemption is necessary and proportionate to the risks in the particular case.
You would think it rather extraordinary that those were excluded from the previous regulations. In addition, a record must be made of the decision to apply the exemption, together with the reasons for that decision. There is also a rebuttable presumption that the data subject will be informed of the use of the exemption.
The ICO welcomed these regulations in its letter to the Home Office as, in its view, satisfying the requirements of the Open Rights Group case. In its view, the proposed changes will ensure that the exemption complies with Article 23(2) of the UK GDPR and ensure that there are appropriate safeguards to protect individuals. Since it took part in the case as an interested party, this is of considerable reassurance. I congratulate the Open Rights Group and the3million on not one but two notable successes in court cases which have forced the Home Office to amend the exemption twice.
I thank all noble Lords for their contributions. I shall start with justification and the public interest, which is obviously at the core of this. Parliament included the immigration exemption as part of the Data Protection Act 2018, as has been noted, for the legitimate purpose of effective immigration control. The Court of Appeal declared in its judgment,
“that there can be no dispute that the Immigration Exemption has a legitimate aim and indeed seeks to advance important public interests.”
We agree with the court: the immigration exemption is vital to prevent the release of information which would otherwise prejudice effective immigration control. I particularly welcome its endorsement by the noble Lord, Lord Coaker.
I want to be clear with noble Lords what those important public interests are. Through targeted use of the immigration exemption, we are able to maintain our capability at the border to prevent criminals and those who seek to cause us harm from threatening our country, as well as to support other agencies and international partners. We are able to frustrate and prevent sham marriages and protect the integrity of ongoing immigration removal and enforcement action and forgery investigations. The immigration exemption is also used to protect people being forced into a marriage and to prevent individuals absconding when there is a planned immigration visit. The central aims are to protect our citizens, ensure the integrity of the border and prevent abuses of the immigration system.
The noble Lord, Lord Coaker, asked about the balancing test. I will come on to the use of the exemption in practice, but it is always clear that the balancing test has to be carried out, and it will now be explicit in the Act. In practice, I can reassure noble Lords that the exemption is employed in around 70% of subject access requests relating to immigration and the Border Force. The amount of data that is restricted by the use of the exemption is, in the vast majority of cases, very little. It is not simply the case that where one piece of information is found to be prejudicial to immigration control, the Home Office does not respond to a request. The piece of information may be redacted as a result, but otherwise a full response will be given. It must be both necessary and proportionate to use the exemption, and this must be balanced against the risk to an individual’s rights. These existing standards will now be set out explicitly in the legislation.
I acknowledge that there was a difference of opinion in the House over whether the previous regulations amending the immigration exemption in 2022 met the requirements of Article 23 of the UK GDPR. The courts agreed with the Government on a wide range of issues in the hearing. They declared that in two areas in particular the amended exemption did not meet those requirements, and the Government respect that ruling. We are confident that these regulations meet the requirements of the judgment in full, and we are supported by the ICO in that opinion.
The noble Baroness, Lady Hamwee, asked whether we consulted the claimants. They were consulted as part of the development of the provisions, and they suggested some additions to the provisions. We accepted suggestions to provide detail on applicable storage periods in the Explanatory Memorandum. We did not accept a suggestion to alter the existing model of ICO oversight of the exemption. The existing model of ICO oversight of the Home Office is robust, and data subjects are able to challenge use of the exemption. I welcome the noble Lord, Lord Clement-Jones, acknowledging the ICO’s part in this.
We also rejected the suggestion to specify in the legislation the wording that must be provided to data subjects when informing them that the provisions of the exemption have been applied. The provisions of the exemption are already accessible to data subjects and adding that detail to primary legislation would be unhelpful.
As regards how the ICO assesses the Government’s use of the immigration exemption, it already assesses the Home Office as part of its statutory role as regulator. Those assessments are published as data protection audit reports, setting out the findings and any recommendations. Should a data subject disagree with the decision to apply the immigration exemption in their case, the usual redress mechanisms to contact the ICO are available.
The noble Lord, Lord Coaker, asked about the application of these rules to children. The immigration exemption applies to all immigration data, but there are special considerations in relation to minors, which are set out in the ICO’s guidance.
The subject of an impact assessment also came up, which relates to oversight and transparency more generally. It is important that these regulations retain the presumption that a data subject should be informed that the immigration exemption has been used—for example, to redact information provided to them in response to a subject access request. That allows the data subject to challenge that decision, should they believe that the application of the exemption is not justified. The ICO has appropriate powers to investigate whether the immigration exemption has been applied appropriately in a specific case. This is in addition to its overall assessment of the Home Office’s data protection practices, which include the use of the immigration exemption more broadly.
An impact assessment was carried out as part of the inclusion of the provision for the immigration exemption in the Data Protection Act 2018. A further supplementary impact assessment was conducted as part of the amendment to the exemption by the SI in 2022. This is noted in the Explanatory Memorandum. Given that there is no substantive change to the safeguards and scope of the exemption, we have not completed a new IA for this instrument.
I am sorry; the Minister seems to be moving on from the impact issue. Clearly there was a period when the old regulation, which is now being superseded, was in operation and individuals were impacted. In a sense, an inappropriate exemption was used. What data does the Minister have about those individuals and the impact on them? What redress do they have? The Minister skated over the ICO’s redress mechanism. Is there no direct mechanism to the Home Office?
I did not skate over it at all; I referred to it explicitly and am happy to do so again, if it would help. I do not know if there is any specific redress to the Home Office. I would imagine not, given that it is explicit that data subjects should go via the ICO. If I am wrong on that, I will clarify.
I have no particular data on the subjects who may have been covered by this before the court’s decision, so I will have to find out, come back and write to the noble Lord if there is anything useful to add.
The Home Office already has relevant guidance and training in place for those exercising the immigration exemption provisions, but we are undertaking a review of those materials to ensure that they align with these regulations. That will be completed in time for the 11 March deadline to amend the current exemption. The instrument is making existing safeguards explicit in the legislation, which are already captured in the existing training and guidance, so we do not expect substantive changes to be needed.
The costs of the court case are not yet settled, but I am happy to commit to write once they have been.
There are a couple more bits to say. How often is the exemption used? The honest answer is not very often. I think I referred to this earlier, so it is probably redundant to say it again but, for the record, in the year ending October 2023, the immigration exemption was applied in around 70% of subject access requests received in relation to immigration, citizenship and the Border Force. Of those, the vast majority had only a small amount of data redacted under the use of the exemption. So I suppose the answer to the noble Lord’s question is that it will have a very minimal impact on people, but I commit to clarify that.
Finally, the noble Lord, Lord Clement-Jones, asked about the relationship between the DPA and retained EU law. The official answer is that the focus of this SI is the immigration exemption and that discussions of the rules and the implications for the DPA 2018 are probably best debated as part of the DPDI Bill, which will, I believe, come to the House on 20 March. The unofficial answer is that I cannot comment on the noble Lord’s disposition because I did not really understand it and I do not have much knowledge of this subject. However, I note that we have left the EU: the people voted. Our rules can now be amended to our own circumstances, and of course, that applies across the entire legal suite. It was a pretty clear vote by the people of this country; I know that that does not suit the Liberal Democrats.
In closing, I hope that I have satisfactorily answered the points that were made and that noble Lords understand the necessity—
(10 months, 1 week ago)
Lords Chamber

To ask His Majesty’s Government what action they are taking to reform the Computer Misuse Act 1990 to enable legitimate independent testing of computer systems.
My Lords, the Government support people undertaking legitimate cybersecurity work to do so without fear of criminalisation. We are actively considering options to strengthen the legislative framework as part of the review of the Computer Misuse Act, which is ongoing. This work is complex and needs a lot of thought, not least to ensure that we do not inadvertently create a loophole that can be exploited by cybercriminals or hostile state actors.
My Lords, the need to be able to carry out independent research into computer systems has been put into the spotlight by the Horizon scandal. We last discussed this issue at Oral Questions last July. Since then, the Government have had the conclusions of a stakeholder working group for several months but have done absolutely nothing to include a public interest defence in the Criminal Justice Bill that is now in the Commons. I described the Government’s progress last year as “glacial”. Was I being unkind to glaciers?
Regrettably, the noble Lord is wrong. We set up a multistakeholder group of systems owners, law enforcement, cybersecurity companies and prosecutors—a systems access group—to specifically consider the proposal of statutory defences. Six meetings were held between May 2023 and October 2023. Unfortunately, there is a lack of consensus among those participants and the cybersecurity industry, and with law enforcement and prosecutors, on whether there is a need for statutory defences and on what is considered to be legitimate activity. That lack of consensus proves the point that careful thought is needed in this area.
My Lords, we are always interested in learning from the approaches taken by other countries and jurisdictions. We speak with our international counterparts, including all our major allies, to understand how they approach the issue of whether there should be defences to these types of offences. But the majority of our like-minded partners do not have statutory defences and are instead in favour of prosecutorial guidance. For example, the US Department of Justice introduced guidance for prosecutors on when to prosecute instances of potential breaches of its Computer Fraud and Abuse Act.
My Lords, does the Minister agree that the Criminal Justice Bill is a good opportunity for the Government to bring forward a public interest amendment, perhaps with the bells and whistles that the Minister is talking about, or is he firmly of the view that this will occur only in the future?
My Lords, I am not quite sure where the bells and whistles come from. As I said, we are just considering all the potential implications. However, part of the Criminal Justice Bill introduces a new power for law enforcement and other investigative agencies to suspend IP addresses and domain names where they are being used to facilitate serious crime. So the answer is partially yes, but the other situation that the noble Lord described is very complicated.
(1 year, 3 months ago)
Lords Chamber

My Lords, the noble Lord will be aware that the City of London Police partially fulfils that function. It prioritised investigators to the City of London as part of its recent increase in the numbers of police. Angela McLaren, the commissioner there, has a strong background in economic crime and its investigation, and the City of London Police runs an economic crime academy. The noble Lord makes an interesting point about having just one agency, but that agency is the National Economic Crime Centre, which co-ordinates all the various activities across the various police forces, including regional organised crime units.
My Lords, given that the UK cyber industry plays a critical role in supporting law enforcement to tackle cyber-enabled fraud, when will the Government reform the Computer Misuse Act so that the cyber industry does not face legal jeopardy for protecting our citizens and businesses online? Is it not high time that the Home Office came to a conclusion on its review?
My Lords, I cannot speculate on that Act but the anti-fraud champion, Anthony Browne MP, has been having some close engagement with industry. An online sector charter—which I appreciate is not entirely the same thing but is certainly related—is due to be published in the autumn, so we should watch and wait for that.
(1 year, 5 months ago)
Lords Chamber

To ask His Majesty’s Government what progress has been made in implementing the recommendations on cybersecurity made by Sir Patrick Vallance in his report Pro-innovation Regulation of Technologies Review: Digital Technologies, published in March.
My Lords, in the Government’s response to the review, we set out that the Home Office is taking forward work to consider the merits and risks of the proposals made. We have created a group that includes law enforcement agencies, prosecutors, the cybersecurity industry and system owners to consider these issues and reach a consensus on the best way forward.
My Lords, Sir Patrick made a very clear recommendation to amend the Computer Misuse Act to include a statutory public interest defence for cybersecurity researchers and professionals carrying out threat intelligence research. This has been extremely long awaited. We finally had a review, which started in 2021 and reported this year; we had a consultation, which concluded in April; and now we have the steps that the Minister talked about. What conclusion can we expect at the end of the day? Progress on this has been totally glacial given the importance to innovation and growth of this change to legislation.
My Lords, I agree that there is an enormous necessity to get this right, but that is part of the problem of why things are perhaps not happening as fast as the noble Lord would like—progress is far from glacial. These issues are incredibly complicated because, as the noble Lord noted, the proposals would potentially allow a defence for the unauthorised access by a person to another’s property, and in this case their computer systems and data, without their knowledge and consent. We therefore need to define what constitutes legitimate cybersecurity activity, where a defence might be applicable and under what circumstances, and how such unauthorised access can be kept to a minimum. We also need to consider who should be allowed to undertake such activity, what professional standards they will need to comply with, and what reporting or oversight will be needed. In short, these are complex matters, and it is entirely right to try to seek a consensus among the agencies I mentioned earlier.
The noble Viscount makes a good point. I am obviously unable to comment on the scheduling of parliamentary business but, when the group that I referred to in my initial Answer has finished its consultations and considerations and come to a consensus, we will of course report back to Parliament. I imagine that will include a debate.
My Lords, does not everything that has been said on this Question today demonstrate the importance of fresh intelligence work and, therefore, the importance of changing the Computer Misuse Act?
I do not think that anybody disagrees with that. I am just saying that we need to get it right and do it properly.
(1 year, 11 months ago)
Lords Chamber

My Lords, I apologise for popping up at this point, not having taken part in the debates so far, but I was requested to do so by the British Academy, the UK’s national academy of humanities and social sciences, of which I am proud to be a fellow. I am also an academic who has in the past collaborated with colleagues from outside the UK in the area of social policy, which of course is trying to influence government.
I am sure I do not need to spell out the importance of international research collaboration, which was touched on by my noble friend Lord Stansgate, especially in the wake of the Science Minister’s speech last week which emphasised the importance of the Government’s global science strategy. Any such strategy requires international collaboration. The British Academy accepts that mechanisms to prevent foreign interference are necessary, but such mechanisms must safeguard the benefits of international research and protect academic freedom. It is worth just noting here what the Joint Committee on Human Rights had to say. It was concerned that this was introduced at such a late stage of the Bill’s passage that it could not comment properly on it, but it said:
“Any foreign influence registration scheme must contain adequate protections to ensure that it does not interfere unduly with democratic rights, including freedom of association and free speech.”
I think everything we have heard so far today, other than from the Minister, suggests that it could interfere in that way.
Indeed, the British Academy argues that such mechanisms exist already and that FIRS would duplicate them in a way that creates totally unnecessary bureaucracy, which surely this Government, of all Governments, want to avoid. It is not helped by the lack of clarity in the wording, which was referred to by the noble Lord, Lord Wallace of Saltaire, with details left for secondary legislation. The effect, the British Academy argues, would be a significant negative impact on the ability of UK researchers to engage internationally, creating irreversible harm to the UK’s research and innovation standing. The academy is not prone to hyperbole.
As currently drafted, as we have heard, FIRS would entangle wide swathes of international activities and is likely to have a chilling effect on international collaboration, not just deterring those with malign intent—as referred to by the Minister—but probably having a much greater impact on those with utterly benign intent. I cannot believe for a moment that this is what the Government want, especially given that it would undermine their own aspirations to forge a global science strategy.
It is in the Government’s own interest to accept the British Academy’s recommendation that they withdraw Part 3—I think I am echoing what the noble Lord, Lord Carlile, said—and consult with it and other relevant organisations to co-create a framework that is proportionate and reasonable, taking into account existing reporting and oversight mechanisms. The academy argues that research and innovation should be largely excluded from FIRS. Is this something that the Government are willing to consider? If not, why not? Will the Minister agree to take this away, have discussions with the British Academy and others and, ideally, withdraw Part 3 altogether as has been suggested or, at the very least, come up with something less harmful before Report? I am echoing other noble Lords in calling for a longer pause than currently envisaged. The more I have listened to today’s debate, the more horrified I have become at what this part of the Bill might mean.
My Lords, I rise to speak to Amendment 103, and I declare my interests as set out in the register.
Like the noble Baronesses, Lady Noakes and Lady Lister, I am new to the Bill and have been provoked by briefings. Like others who have spoken today, I emphasise that I am absolutely no fan of this foreign influence registration scheme, which is far too broad in its application, as we have heard. I think it will be highly damaging to UK research and development, inward investment and British interests around the world. The noble Baroness, Lady Hayter, listed those who might get caught up in the scheme, and clearly very few of those have any connection at all with national security. I am delighted to support many amendments in this group and, in particular, the clause stand part notices that the noble Lords, Lord Anderson of Ipswich and Lord Carlile of Berriew, and my noble friend Lord Wallace have spoken to so cogently.
This has given us the opportunity to debate the flawed nature of the whole scheme. I will make some remarks about the impact on business and investment, which my noble friend Lord Fox would have made were he able to be here. We have heard powerful testimony from the British Academy, referred to by the noble Baroness, Lady Lister, and from the Russell group, referred to by the noble Viscount, Lord Stansgate, about the hugely detrimental potential impact of the Bill on the international research and development front. The British Academy rightly says that international collaboration is critical to the excellence of UK research and the Government’s aim to become a scientific and global science superpower. As it says, as currently drafted the FIRS will have a severely negative impact on the UK’s ability to engage with researchers internationally and on the ability of researchers in the humanities and social sciences to engage on critical public policy topics, and it will irrevocably harm the UK’s research and innovation standing. Strong words.
Under the scheme as currently proposed, research universities will, at a minimum, be smothered in red tape and, at worst, face heavy criminal penalties for undertaking international research partnerships. Bluntly, I must tell the Minister that his amendments add very little to the clarity of this scheme. The Minister’s letter about the intersection with the National Security and Investment Act, which we debated in 2021, was far from convincing. There is already a raft of other legislation relating to the academic technology approval scheme and export control, which impacts on a university’s international activities. If this scheme, by mischance, does go through, it makes Amendment 104, in the name of my noble friend Lord Wallace, the absolute bare minimum needed. Both the Russell group and the British Academy make the case for clarity, non-duplication, proportionality and a high threshold for registration, none of which is currently present in the scheme.
A further cause for withdrawal of this scheme is the strong reaction from the business and investment community. That is why this stand part debate is so important. The ABI states very clearly that the current proposal for the FIRS
“risks placing significant reporting burden on insurers and long-term savings providers investing in the UK, with the potential to negatively impact the UK’s international competitiveness and attractiveness as a place to invest”.
TheCityUK says these proposals
“if passed unamended would have a chilling effect on inward investment into the UK”.
My Lords, I thought I was very clear on the precise specified persons tier here. A UK university would need to be acting at the direction of a specified foreign power or a specified foreign power-controlled entity before registration requirements could apply. I think that covers the set of circumstances just outlined by the noble Viscount.
The Minister spoke about universities. Did he mean the academics—any academic within the universities?
Yes.
Amendment 103 was tabled by the noble Lord, Lord Clement-Jones, to remove the exemption from the registration requirement in FIRS for lawyers providing legal activities. While I welcome the challenge, removing this exemption would risk undermining long-standing protections the UK has afforded to the provision of confidential legal advice and the equitable administration of justice. The exemption is available only to lawyers carrying out legal activity and so would not apply to other individuals carrying out legal activity.
I also reiterate what was said in Committee in the other place: that this exemption does not completely exempt legal professionals from engaging with the scheme. It does not cover all the activities that could be undertaken by a legal professional as part of an arrangement with a foreign principal. Activities that are not strictly legal activities, such as lobbying, for example, may still need to be registered. So, for example, if a lawyer were to enter into an arrangement with a foreign power to lobby a UK government Minister or parliamentarian on the UK’s foreign policy towards that foreign power, that would be registrable. The fact that the individual is a lawyer is not sufficient in and of itself to exempt them from registration.
I heard what the Minister said about lobbying and the additional aspect of lobbying by law firms, but why is any exemption needed beyond what is contained in Clause 74, which covers legal professional privilege effectively—legal proceedings and so on—so that no confidential information needs to be divulged? Why is it not necessary that a law firm is acting for a foreign power or an entity controlled by a foreign power? Why should that be exempt?
I think I explained this in reasonable detail. It goes back to the sort of work the lawyers carry out. As I say, it is the long-standing protections that the UK has afforded—
All the Minister is saying, in a highly circular way, is that it is in here because it has always been in here in some other forms of legislation. I do not think that is much of an answer.
In that case, I am very sorry to disappoint the noble Lord. I apologise for having spoken at such length.
(2 years ago)
Grand Committee
My Lords, it is a pleasure to follow three such excellent opening speeches. I draw attention to my interests in the register, particularly my interest in artificial intelligence technologies as a former chair of the AI Select Committee of this House. As a non-member of her committee, I congratulate my noble friend Lady Hamwee and the committee on such a comprehensive and well-argued report.
I entirely understand and welcome the width of the report but today I shall focus on live facial recognition technology, a subject that I have raised many times in this House and elsewhere in Questions and debates, and even in a Private Member’s Bill, over the last five years. The previous debate involving a Home Office Minister—the predecessor of the noble Lord, Lord Sharpe, the noble Baroness, Lady Williams—was in April, on the new College of Policing guidance on live facial recognition.
On each occasion, I drew attention to why guidance or codes are regarded as insufficient by me and by many other organisations such as Liberty, Big Brother Watch, the Ada Lovelace Institute, the former Information Commissioner, current and former Biometrics and Surveillance Camera Commissioners and the Home Office’s own Biometrics and Forensics Ethics Group, not to mention the Commons Science and Technology Committee. On each occasion, I have raised the lack of a legal basis for the use of this technology—and on each occasion, government Ministers have denied that new explicit legislation or regulation is needed, as they have in the wholly inadequate response to this report.
In the successful appeal of Liberal Democrat Councillor Ed Bridges, the Court of Appeal case on the police use of live facial recognition issued in August 2020, the court ruled that South Wales Police’s use of such technology had not been in accordance with the law on several grounds, including in relation to certain human rights convention rights, data protection legislation and the public sector equality duty. So it was with considerable pleasure that I read the Justice and Home Affairs Committee report, which noted the complicated institutional landscape around the adoption of this kind of technology, emphasised the need for public trust and recommended a stronger legal framework with primary legislation embodying general principles supported by detailed regulation, a single national regulatory body, minimum scientific standards, and local or regional ethics committees put on a statutory basis.
Despite what paragraph 4 of the response says, neither House of Parliament has ever adequately considered or rigorously scrutinised automated facial recognition technology. We remain in the precarious position of police forces dictating the debate, taking it firmly out of the hands of elected parliamentarians and instead—as with the recent College of Policing guidance—marking their own homework. A range of studies have shown that facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from those groups are more likely to be wrongly stopped and questioned by police, and to have their images retained as the result of a false match.
The response urges us to be more positive about the use of new technology, but the UK is now the most camera-surveilled country in the Western world. London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. The last Surveillance Camera Commissioner did a survey, shortly before stepping down, and found that there are over 6,000 systems and 80,000 cameras in operation in England and Wales across 183 local authorities. The ubiquity of surveillance cameras, which can be retrofitted with facial recognition software and fed into police databases, means that there is already an apparatus in place for large-scale intrusive surveillance, which could easily be augmented by the widespread adoption of facial recognition technology. Indeed, many surveillance cameras in the UK already have advanced capabilities such as biometric identification, behavioural analysis, anomaly detection, item/clothing recognition, vehicle recognition and profiling.
The breadth of public concern around this issue is growing clearer by the day. Many cities in the US have banned the use of facial recognition, while the European Parliament has called for a ban on the police use of facial recognition technology in public places and predictive policing. In 2020 Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies.
Public trust is crucial. Sadly, the new Data Protection and Digital Information Bill does not help. As the Surveillance Camera Commissioner said last year, in a blog about the consultation leading up to it:
“This consultation ought to have provided a rare opportunity to pause and consider the real issues that we talk about when we talk about accountable police use of biometrics and surveillance, a chance to design a legal framework that is a planned response to identified requirements rather than a retrospective reaction to highlighted shortcomings, but it is an opportunity missed.”
Now we see that the role of Surveillance Camera Commissioner is to be abolished in the new data protection Bill—talk about shooting the messenger. The much-respected Ada Lovelace Institute has called, in its report Countermeasures and the associated Ryder review in June this year, for new primary legislation to govern the use of biometric technologies by both public and private actors, for a new oversight body and for a moratorium until comprehensive legislation is passed.
The Justice and Home Affairs Committee stopped short of recommending a moratorium on the use of LFR, but I agree with the institute that a moratorium is a vital first step. We need to put a stop to this unregulated invasion of our privacy and have a careful review, so that its use can be paused while a proper regulatory framework is put in place. Rather than update and use toothless codes of practice, as we are urged to do by the Government, to legitimise the use of new technologies such as live facial recognition, the UK should have a root-and-branch surveillance camera and biometrics review, which seeks to increase accountability and protect fundamental rights. The committee’s report is extremely authoritative in this respect. I hope today that the Government will listen but, so far, I am not filled with optimism about their approach to AI governance.
(2 years, 1 month ago)
Lords Chamber
To ask His Majesty’s Government, further to reports that (1) at least 48 councils employ private companies to issue penalties for public spaces protection orders, and (2) many councils pay those companies per fine issued which incentivises companies to issue more penalties than may be necessary, what plans they have to introduce statutory guidance prohibiting this practice.
My Lords, it is for local authorities to determine how to operate the powers granted to them in legislation. Contracting enforcement to third parties is a common arrangement and it is for the local authority to ensure it is just. Contractors are bound by the same legal obligations and safeguards in legislation as the councils themselves.
My Lords, that is a classic dusty reply from the Home Office. What a contrast with Defra: its guidance on littering, which is a criminal offence, says that incentivising enforcement undermines
“the legitimacy of the enforcement regime”.
Wherever it has occurred, fining for profit has been associated with cases of injustice and now Defra is putting that in statutory guidance. Why is the Home Office not going to do this in its own guidance on the Anti-social Behaviour, Crime and Policing Act?
My Lords, I think it is worth reminding the House about public space protection orders, which are intended to deal with a particular nuisance or problem in a specific area that is detrimental to the local community’s quality of life by imposing conditions on the use of that area which apply to everyone. So the Home Office did publish statutory guidance to support local areas to make effective use of these powers. The guidance sets out the importance of focusing on the needs of the victim and the local community, as well as ensuring that the relevant legal tests are met. I repeat that it is for local authorities to determine how to enforce PSPOs and that can include the use of private contractors. Local authorities are obliged to follow the rules set out in the Public Contracts Regulations 2015 in their appointment of such companies.
As I said earlier, the contracts that are awarded to these companies are governed by quite stringent guidance and rules. It is a matter for local authorities and the contracting companies.
My Lords, if Defra is able to do this, why can the Home Office not do it? Defra is also very close to local government and clearly regards this as the wrong thing for local councils to be doing. Why does the Home Office not regard it as the wrong thing for councils to be doing?
Well, the noble Lord has already asked me that and I think I have already answered. The Home Office has provided statutory guidance to support local areas to make effective use of these powers. I go back to my earlier answer: the local areas are obliged to follow the rules set out in the Public Contracts Regulations 2015 before appointing such companies.
(2 years, 7 months ago)
Lords Chamber
My Lords, I shall focus mainly on the Government’s digital proposals. As my noble friend Lady Bonham-Carter, the noble Baroness, Lady Merron, and many other noble Lords have made clear, the media Bill and Channel 4 privatisation will face fierce opposition all around this House. It could not be clearer that the policy towards both Channel 4 and the BBC follows some kind of red wall-driven, anti-woke government agenda that has zero logic. The Up Next White Paper on PSB talks of
“embedding the importance of distinctively British content directly into the existing quota system.”
How does the Minister define “distinctively British content”? Is it whatever the Secretary of State believes it is? As for the Government’s response to the consultation on audience protection standards on VOD services, can the Minister confirm that Ofcom will have the power to assess whether a platform’s own-brand age ratings genuinely take account of the values and expectations of UK families, as the BBFC’s do?
Having sat alongside the noble Lord, Lord Stevenson, on the joint scrutiny committee on the draft Online Safety Bill, I agreed with all his remarks today. I welcome the fact that its provisions are directed primarily at the business model of the social media platforms—in particular, the inclusion of scam advertising within the Bill and the inclusion of pornographic sites—but it is vital, if we are to have privacy protecting age verification, that principles for age assurance are included in the Bill. I welcome the intention to legislate for the new criminal communications offences recommended by the Law Commission, but without these being passed into law, the Bill will be completely defective, and we must incorporate the hate crime offences too.
But there are key issues that will need dealing with in the Bill’s passage through Parliament. As we have heard from many noble Lords, the “legal but harmful” provisions are potentially dangerous to freedom of expression, with those harms not being defined in the Bill itself. Similarly, with the lack of definition of children’s harms, it needs to be clear that encouraging self-harm or eating disorders is explicitly addressed on the face of the Bill, as my honourable friend Jamie Stone emphasised on Second Reading. My honourable friend Munira Wilson raised whether the metaverse was covered. Noble Lords may have watched the recent Channel 4 “Dispatches” exposing harms in the metaverse and chat rooms in particular. Without including it in the primary legislation, how can we be sure about this? In addition, the category definitions should be based more on risk than on reach, which would take account of cross-platform activity.
One of the great gaps not filled by the Bill, or the recent Elections Act just passed, is the whole area of misinformation and disinformation which gives rise to threats to our democracy. The Capitol riots of 6 January last year were a wake-up call, along with the danger of Donald Trump returning to Twitter.
The major question is why the draft digital markets, competition and consumer Bill is only a draft Bill in this Session. The DCMS Minister Chris Philp himself said in a letter to the noble Baroness, Lady Stowell—the Chair of the Communications and Digital Committee—dated just this 6 May, that
“urgent action in digital markets is needed to address the dominance of a small number of very powerful tech firms.”
In evidence to the BEIS Select Committee, the former chair of the CMA, the noble Lord, Lord Tyrie, recently stressed the importance of new powers to ensure expeditious execution and to impose interim measures.
Given the concerns shared widely within business about the potential impact on data adequacy with the EU, the idea of getting a Brexit dividend from major amendments to data protection through a data reform Bill is laughable. Maybe some clarification and simplification are needed—but not the wholesale changes canvassed in the Data: A New Direction consultation. Apart from digital ID standards, this is a far lower business priority than reforming competition regulation. A report by the New Economics Foundation made what it said was a “conservative estimate” that if the UK were to lose its adequacy status, it would increase business costs by at least £1.6 billion over the next 10 years. As the report’s author said, that is just the increased compliance costs and does not include estimates of the wider impacts around trade shifting, with UK businesses starting to lose EU customers. In particular, as regards issues relating to automated decision-making, citizens and consumers need more protection, not less.
As regards the Product Security and Telecommunications Infrastructure Bill, we see yet more changes to the Electronic Communications Code, all the result of the Government taking a piecemeal approach to broadband rollout. I do, however, welcome the provisions on security standards for connectable tech products.
Added to a massive programme of Bills, the DCMS has a number of other important issues to resolve: the AI governance White Paper; gambling reform, as mentioned by my noble friend Lord Foster; and much-needed input into IP and performers’ rights reform and protection where design and AI are concerned. I hope the Minister is up for a very long and strenuous haul. Have the Government not clearly bitten off more than the DCMS can chew?
(2 years, 8 months ago)
Lords Chamber
To ask Her Majesty’s Government what assessment they have made of the new College of Policing guidance on live facial recognition.
My Lords, facial recognition is an important public safety tool that helps the police to identify and eliminate suspects more quickly and accurately. The Government welcome the College of Policing’s national guidance, which responds to a recommendation in the Bridges v South Wales Police judgment.
My Lords, despite committing to a lawful, ethical approach, the guidance gives carte blanche to the use of live and retrospective facial recognition, potentially allowing innocent victims and witnesses to be swept on to police watch-lists. This is without any legislation or parliamentary or other oversight, such as that recently recommended by the Justice and Home Affairs Committee, chaired by my noble friend Lady Hamwee. Are we not now sleep-walking into a surveillance society, and is it not now time for a moratorium on this technology, pending a review?
I disagree with everything that the noble Lord has said. I think every police force in the country uses retrospective facial recognition. Watch-lists are deleted upon use at a deployment, so there is no issue regarding ongoing data protection. Importantly, just as CCTV and retrospective recognition are still used to detect criminals, missing persons and vulnerable people, so is the application of LFR.
(2 years, 10 months ago)
Lords Chamber
That this House regrets the Surveillance Camera Code of Practice because (1) it does not constitute a legitimate legal or ethical framework for the police’s use of facial recognition technology, and (2) it is incompatible with human rights requirements surrounding such technology.
Relevant document: 23rd Report from the Secondary Legislation Scrutiny Committee
My Lords, I have raised the subject of live facial recognition many times in this House and elsewhere, most recently last November, in connection with its deployment in schools. Following an incredibly brief consultation exercise, timed to coincide with the height of the summer holidays last year, the Government laid an updated Surveillance Camera Code of Practice, pursuant to the Protection of Freedoms Act 2012, before both Houses on 16 November last year, which came into effect on 12 January 2022.
The subject matter of this code is of great importance. The last Surveillance Camera Commissioner did a survey shortly before stepping down, and found that there are over 6,000 systems and 80,000 cameras in operation across 183 local authorities. The UK is now the most camera-surveilled country in the western world. According to recently published statistics, London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. We are also faced with a rising tide of the use of live facial recognition for surveillance purposes.
Let me briefly give a snapshot of the key arguments why this code is insufficient as a legitimate legal or ethical framework for the police’s use of facial recognition technology and is incompatible with human rights requirements surrounding such technology. The Home Office has explained that changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those necessitated by the successful appeal of Councillor Ed Bridges in the Court of Appeal judgment on police use of live facial recognition issued in August 2020, which ruled that South Wales Police’s use of AFR—automated facial recognition—had not in fact been in accordance with the law on several grounds, including in relation to certain convention rights, data protection legislation and the public sector equality duty.
During the fifth day in Committee on the Police, Crime, Sentencing and Courts Bill last November, the noble Baroness, Lady Williams of Trafford, the Minister, described those who know about the Bridges case as “geeks”. I am afraid that does not minimise its importance to those who want to see proper regulation of live facial recognition. In particular, the Court of Appeal held in Bridges that South Wales Police’s use of facial recognition constituted an unlawful breach of Article 8—the right to privacy—as it was not in accordance with law. Crucially, the Court of Appeal demanded that certain bare minimum safeguards were required for the question of lawfulness to even be considered.
The previous surveillance code of practice failed to provide such a basis. This, the updated version, still fails to meet the necessary standards, as the code allows wide discretion to individual police forces to develop their own policies in respect of facial recognition deployments, including the categories of people included on a watch-list and the criteria used to determine when to deploy. There are but four passing references to facial recognition in the code itself. This scant guidance cannot be considered a suitable regulatory framework for the use of facial recognition.
There is, in fact, no reference to facial recognition in the Protection of Freedoms Act 2012 itself or indeed in any other UK statute. There has been no proper democratic scrutiny over the code and there remains no explicit basis for the use of live facial recognition by police forces in the UK. The forthcoming College of Policing guidance will not satisfy that test either.
There are numerous other threats to human rights that the use of facial recognition technology poses. To the extent that it involves indiscriminately scanning, mapping and checking the identity of every person within the camera’s range—using their deeply sensitive biometric data—LFR is an enormous interference with the right to privacy under Article 8 of the ECHR. A “false match” occurs where someone is stopped following a facial recognition match but is not, in fact, the person included on the watch-list. In the event of a false match, a person attempting to go about their everyday life is subject to an invasive stop and may be required to show identification, account for themselves and even be searched under other police powers. These privacy concerns cannot be addressed by simply requiring the police to delete images captured of passers-by or by improving the accuracy of the technology.
The ECHR requires that any interference with the Article 10 right to freedom of expression or the Article 11 right to free association is in accordance with law and both necessary and proportionate. The use of facial recognition technology can be highly intimidating. If we know our faces are being scanned by police and that we are being monitored when using public spaces, we are more likely to change our behaviour and be influenced on where we go and who we choose to associate with.
Article 14 of the ECHR ensures that no one is denied their rights because of their gender, age, race, religion or beliefs, sexual orientation, disability or any other characteristic. Police use of facial recognition gives rise to two distinct discrimination issues: bias inherent in the technology itself and the use of the technology in a discriminatory way.
Liberty has raised concerns regarding the racial and socioeconomic dimensions of police trial deployments thus far—for example, at Notting Hill Carnival for two years running as well as twice in the London Borough of Newham. The disproportionate use of this technology in communities against which it “underperforms”—according to its proponents’ own standards—is deeply concerning.
As regards inherent bias, a range of studies have shown facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from these groups are more likely to be wrongly stopped and questioned by police and to have their images retained as the result of a false match.
The Court of Appeal determined that South Wales Police had failed to meet its public sector equality duty, which requires public bodies and others carrying out public functions to have due regard to the need to eliminate discrimination. The revised code not only fails to provide any practical guidance on the public sector equality duty but also, given the inherent bias within facial recognition technology, fails to emphasise the rigorous analysis and testing that the duty requires.
The code itself covers nobody other than the police and local authorities; in particular, it does not apply to Transport for London, central government or private users, where there have also been concerning developments in the use of police data. For example, it was revealed that the Trafford Centre in Manchester scanned the faces of every visitor for a six-month period in 2018, using watch-lists provided by Greater Manchester Police—approximately 15 million people. LFR was also used at the privately owned but publicly accessible site around King’s Cross station. Both the Met and British Transport Police had provided images for their use, despite originally denying doing so.
It is clear from the current and potential future human rights impact of facial recognition that this technology has no place on our streets. In a recent opinion, the former Information Commissioner took the view that South Wales Police had not ensured that a fair balance had been struck between the strict necessity of the processing of sensitive data and the rights of individuals.
The breadth of public concern around this issue is growing clearer by the day. Several major cities in the US have banned the use of facial recognition and the European Parliament has called for a ban on police use of facial recognition technology in public places and predictive policing. In response to the Black Lives Matter uprisings in 2020, Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies. Facebook, aka Meta, also recently announced that it will be shutting down its facial recognition system and deleting the “face prints” of more than a billion people after concerns were raised about the technology.
In summary, it is clear that the Surveillance Camera Code of Practice is an entirely unsuitable framework to address the serious rights risk posed by the use of live facial recognition in public spaces in the UK. As I said in November in the debate on facial recognition technology in schools, the expansion of such tools is a
“short cut to a widespread surveillance state.”—[Official Report, 4/11/21; col. 1404.]
Public trust is crucial. As the Biometrics and Surveillance Camera Commissioner said in a recent blog:
“What we talk about in the end, is how people will need to be able to have trust and confidence in the whole ecosystem of biometrics and surveillance”.
I have on previous occasions, not least through a Private Member’s Bill, called for a moratorium on the use of LFR. In July 2019, the House of Commons Science and Technology Committee published a report entitled The Work of the Biometrics Commissioner and the Forensic Science Regulator. It repeated a call made in an earlier 2018 report that
“automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved.”
The much-respected Ada Lovelace Institute has also called for
“a voluntary moratorium by all those selling and using facial recognition technology”,
which would
“enable a more informed conversation with the public about limitations and appropriate safeguards.”
Rather than update toothless codes of practice to legitimise the use of new technologies like live facial recognition, the UK should have a root and branch surveillance camera review which seeks to increase accountability and protect fundamental rights. The review should investigate the novel rights impacts of these technologies, the scale of surveillance we live under and the regulations and interventions needed to uphold our rights.
We were reminded by the leader of the Opposition on Monday about what Margaret Thatcher said, and I also said this to the Minister earlier this week:
“The first duty of Government is to uphold the law. If it tries to bob and weave and duck around that duty when it’s inconvenient, if Government does that, then so will the governed and then nothing is safe—not home, not liberty, not life itself.”
It is as apposite for this debate as it was for that debate on the immigration data exemption. Is not the Home Office bobbing and weaving and ducking precisely as described by the late Lady Thatcher?
My Lords, the noble Lord, Lord Clement-Jones, has given an eloquent exposition of the reasons for supporting his Motion of Regret. The Motion refers to the ethical and human rights considerations that attach to the use of surveillance camera technology, and it is to those two considerations that I shall address my remarks. I especially draw the Minister’s attention to the Amnesty International report of 3 June 2021 about the use of surveillance technology in New York, to which the noble Lord referred, and also to the serious civil liberty questions that that report raised. Concerns were raised in Japan on 28 December, in Yomiuri Shimbun, and in the Financial Times on 10 June, about Chinese technology in Belgrade, and on the Asia News Monitor in November 2021 in a report from Thailand about mass surveillance against Uighurs in Xinjiang, as well as a report in the Telegraph of 1 December, in which the head of MI6, Richard Moore, said that
“technologies of control … are increasingly being exported to other governments by China—expanding the web of authoritarian control around the planet”.
It is not just control—it is also a keystone in the export of truly shocking crimes against humanity and even genocide. Just a week ago, we marked Holocaust Memorial Day, on which many colleagues from across the House signed the Holocaust Memorial Day book or issued statements recommitting to never allowing such a genocide to happen ever again. Yet, sadly, in 2022, as the Foreign Secretary has said, a genocide against the Uighur Muslims is taking place in Xinjiang. As I argued in our debate on Monday, we are doing far too little to sanction those companies that are actively involved, or to regulate and restrict the facial recognition software that has allowed the Chinese state to incarcerate and enslave more than a million Uighurs.
In the 1940s, we did not allow the widespread use of IBM’s machines, or other tools of genocide used in Nazi Germany and manufactured by slave labour in factories and concentration camps, to be sold in the United Kingdom. Today we find ourselves in the perverse situation of having Chinese surveillance cameras with facial recognition software being used in government departments, hospitals, schools and local councils as well as in shops, such as Tesco and Starbucks. It is an issue that I doggedly raised during our debates on the telecommunications Bills that have recently been before your Lordships’ House. As I said in those debates, a series of freedom of information requests in February 2021 found that more than 70% of local councils use surveillance cameras and software from either Dahua Technology or Hikvision, which are companies rightly subject to United States sanctions for their involvement in the development and installation of technology and software that targets Uighur Muslims. Nevertheless, these companies are free to operate in the United Kingdom.
So much for co-ordinating our response with our Five Eyes allies, which was the subject of one amendment that I laid before your Lordships’ House. Far from being a reputable or independent private company, more than 42% of Hikvision is owned by Chinese state-controlled enterprises. According to Hikvision’s accounts, for the first half of 2021, the company received RMB 223 million in state subsidies, while the company works hand in glove with the authorities in Xinjiang, having signed five public-private partnerships with them since 2017. Perhaps just as disturbing are the recent reports in the Mail on Sunday that Hikvision received up to £10,000 per month of furlough money from United Kingdom taxpayers from December 2020 until February 2021. How can it be right that, at a time when the US Government are sanctioning Hikvision for its links to Uighur concentration camps, the UK Government are giving it taxpayer money and Covid furlough funds?
It is clear that the introduction and use of this type of facial recognition software technology by the police needs substantial regulation and oversight, especially because of the dominance of sanctioned Chinese companies in the UK surveillance market. Hikvision alone has nearly 20% of the global surveillance camera market. Hikvision is working hard to penetrate and dominate the UK surveillance technology sector. In May 2021, it launched a consultant support programme and demonstration vehicles so it could bring its technology
“to all parts of the United Kingdom”.
In October, it became a corporate partner in the Security Institute, the UK’s largest membership body for security professionals, and it has launched a dedicated UK technology partner programme. All of this deserves further investigation by our domestic intelligence services.
I referenced this without mentioning the company’s name. I recognise the seriousness of the issue and I will take the point back.
I have had a note to say that it is at constable level, but of course they are accountable to the PCC.
My Lords, I thank the Minister for her comprehensive reply. This has been a short but very focused debate and full of extraordinary experience from around the House. I am extremely grateful to noble Lords for coming and contributing to this debate in the expert way they have.
Some phrases rest in the mind. The noble Lord, Lord Alton, talked about live facial recognition being the tactic of authoritarian regimes, and there are several unanswered questions about Hikvision in particular that he has raised. The noble Lord, Lord Anderson, talked about the police needing democratic licence to operate, which was also the thrust of what the noble Lord, Lord Rosser, has been raising. It was also very telling that the noble Lord, Lord Anderson, said the IPA code was much more comprehensive than this code. That is somewhat extraordinary, given the subject matter of the IPA code. The mantra of not stifling innovation seems to cut across every form of government regulation at the moment. The fact is that, quite often, certainty in regulation can actually boost innovation; I think that is completely lost on this Government.
The noble Baroness, Lady Falkner, talked about human rights being in a parlous state, and I appreciated her remarks—both in a personal capacity and as chair of the Equality and Human Rights Commission—about the public sector equality duty and what is required, and the fact that human rights need to be embedded in the regulation of live facial recognition.
Of course, not all speakers would go as far as I would in asking for a moratorium while we have a review. However, all speakers would go as far as I go in requiring a review. I thought the adumbration by the noble Lord, Lord Rosser, of the elements of a review of that kind was extremely useful.
The Minister spent some time extolling the technology, its accuracy and freedom from bias and so on, but in a sense that is a secondary issue. Of course it is important, but the underpinning of this by a proper legal framework is crucial. Telling us all to wait until we see the College of Policing guidance does not really seem satisfactory. The aspect underlying everything we have all said is that this is piecemeal; it is a patchwork of legislation. You take a little bit from equalities legislation, a little bit from the Data Protection Act, a little bit to come, we know not what, from the College of Policing guidance. None of that is satisfactory. Do we all just have to wait around until the next round of judicial review and the next case against the police demonstrate that the current framework is not adequate?
Of course I will not put this to a vote. This debate was to put down a marker—another marker. The Government cannot be in any doubt at all that there is considerable anxiety and concern about the use of this technology, but this seems to be the modus operandi of the Home Office: do the minimum as required by a court case, argue that it is entirely compliant when it is not and keep blundering on. This is obviously light relief for the Minister compared with the police Bill and the Nationality and Borders Bill, so I will not torture her any further. However, I hope she takes this back to the Home Office and that we come up with a much more satisfactory framework than we have currently.