Crime and Policing Bill

Debate between Lord Holmes of Richmond and Lord Blencathra
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to speak in this Committee and to follow my friend the noble Lord, Lord Clement-Jones, who perfectly and proportionately set out the principles in this amendment, which I support to every last sentence. We are now discussing a number of amendments on areas where the existing law, and this Bill as drafted, are clearly out of date and full of gaps—not least when we consider how our nation, our economy and the state itself are seeking to move to digitisation, which has such benefits for citizens and communities, our cities and our entire country. But one key element which enables, empowers and underpins almost every element of that digital transformation is effective digital ID.

There are a number of arguments that could be made at another time about the correct approach to digital ID. I would suggest that the principles around self-sovereign ID should be strongly considered. Mandation is clearly problematic, while the reasons for introducing a digital ID should be clearly made and the benefits set out. But the specifics of this amendment are clear, proportionate and timely, because a digital ID is critical and essential to availing oneself of the opportunities and, indeed, to protecting oneself against many of the harms. Not to have a digital ID protected by the criminal law would be a huge, inexplicable and indefensible gap.

If the Government want digital ID to be the means of accessing government services and to see greater digital inclusion—and, through that, the attendant and very necessary financial inclusion—action to protect our digital ID is critical. The noble Lord, Lord Clement-Jones, effectively set out his amendment, which is proportionate, valid, timely and necessary. I very much look forward to the Minister accepting the principle as set out.

Lord Blencathra (Con)

My Lords, identity theft, as my noble friend Lord Holmes of Richmond said, is no longer a niche crime; it is the dominant fraud type in the UK and is getting worse. In 2024, over 421,000 fraud cases were filed with the National Fraud Database, and almost 250,000 of those were identity fraud filings, making identity theft the single largest category recorded by industry partners. Cifas, the Credit Industry Fraud Avoidance System, recorded a record number of cases on the National Fraud Database in 2024. Its member organisations prevented more than £2.1 billion of attempted loss, yet criminals are shifting tactics. Account takeovers rose by 76%, and unauthorised SIM swaps surged, driven by the rapid adoption of AI and generative tools that let fraudsters create convincing fake documents and synthetic identities at scale.

We have all read of some of the high-profile examples: celebrity impersonation via deepfakes and cloned voices has been widely reported, and manipulated videos and voice clones purporting to show public figures, from Elon Musk to Martin Lewis, Holly Willoughby and others, have been used to run investment scams and phishing campaigns. Documented losses include large individual sums linked to celebrity impersonation scams; one NatWest customer is reported to have lost £150,000 after responding to a scam impersonating Martin Lewis.

However, I think we are all more concerned with the tens of thousands of ordinary people who are not celebrities and who lose all their savings to these crooks. They are the victims who suffer real financial loss and damage, with long and costly recovery processes, while businesses face rising prevention costs and operational strain. I therefore strongly support the concept of the draft clause and the need for it. While it is well intentioned, however, I fear that it has some technical difficulties. It is somewhat broad and vague about what “obtains” and “impersonate” mean. It also risks overlap with the Fraud Act, the Computer Misuse Act and the Data Protection Act, and it lacks clear defences for legitimate security research and lawful investigations. It also needs to address AI and deepfake-specific methods, and to set out what can be done about extraterritorial reach, for example, or aggravating factors for organised, large-scale operations.

We all know that my noble friend Lord Holmes of Richmond is, as we have just heard, an absolute expert on AI; he recently addressed a top-level group of the Council of Europe on this subject. May I suggest that he and the noble Lord, Lord Clement-Jones, get together with the Home Office or other government digital experts and bring back on Report a more tightly drafted amendment? Among other things, it should tighten the definitions of “obtain”, “impersonate” and “sensitive”; ensure that the mens rea is tied to dishonesty or intent to cause loss or gain; include recklessness in enabling others; limit the scope to unlawfully obtained data or use that bypasses authentication; and explicitly include AI/deepfake methods when used to bypass checks or cause reliance. It should also have clear defences for lawful authority and make sure that duplication is avoided, whether it be with the Fraud Act, the Computer Misuse Act or the Data Protection Act. Finally—I know this is an impossible ask, and that Governments find it almost impossible to do—something should be done about extraterritorial reach, because that is terribly important.

I say to the Minister: there is a gap in the legislation here. We should plug it, and we may have time to bring back on Report a more tightly drawn amendment that would deal with all the concerns of noble Lords and the possible problems I have just raised.