Data Protection and Digital Information Bill Debate
Lord Holmes of Richmond (Conservative - Life peer)
(11 months, 1 week ago)
Lords Chamber
My Lords, it is a pleasure to take part on Second Reading; I declare my interests in financial services and technology, in Ecospend Ltd and Boston Ltd. There is a fundamental truth at the heart of our deliberations, both on Second Reading and as we progress to Committee: that is, it is our data. There are no large language models without our data; perhaps it would be more appropriate to call them large data models; maybe then they would be more easily and quickly understood by more people. Ultimately, our data is going into AI for potentially positive and transformational purposes, but only if there is consent, understanding, trustworthiness and a real connection between the purpose to which the AI is being put and those of us whose data is being put into it.
I am going to focus on four areas: one is data adequacy, which has already, understandably, been heavily mentioned; then AI, smart data and digital ID. I can probably compress everything I was going to say on the first subject by simply asking my noble friend the Minister: how will the Bill assure adequacy between the UK and the EU? It is quite a large Bill, as other noble Lords have commented, yet it still has a number of gaps that I am sure we will all be keen to fill in fully when we return in 2024. As already mentioned, AI is nothing without data, so what checks are being put in place for the many suggestions throughout the Bill where AI is used to interrogate individuals’ data? Would it not be absolutely appropriate for there to be effective, clear and transparent labelling of AI, not only in the public sector but across all public and private sector uses? Saying this almost feels like going off track from the Bill into AI considerations, but it seems impossible to consider the Bill without seeing how it is inextricably linked to AI and the pro-innovation AI White Paper published earlier this year. Does the Minister not agree? How much line-by-line analysis has been done of the Bill to ensure that there is coherence between the Government’s ambitions for AI and what is currently set out in this Bill?
On smart data, there are clearly extraordinary opportunities, but they are not inevitabilities. To consider just one sector, energy: to be able to deploy customers’ data in real time—through their smart meters, for example—and auto-shift them in real time to the cheapest tariff could be extraordinarily positive. But again, that is only if there is an understanding of how the consent mechanisms will work and how each citizen is enabled to understand that it is their data. There are potentially huge opportunities, not least to do something significant about the poverty premium, where all too often those who find themselves with the least are forced to pay the most, often for essential services such as energy. What are the Government doing to look at additional sectors for smart data deployment? What areas of state activity, including previous state activity, are being considered for the deployment of smart data? What stage is that analysis at?
On digital ID, about which I have spoken a lot over previous years, again there are huge opportunities and possibilities. I welcome what is in the Bill around the potential use of digital ID in property transactions; this could be an extraordinarily positive development. What other areas are being looked at for potential digital ID usage? What stage is that analysis at? Also, is what is set out in the Bill coherent with work on digital ID being done in other government departments? It seems that a lot has been done, and there have been a number of efforts from various Administrations on digital ID, but we are yet to realise the prize it could bring.
I will ask my noble friend some questions in conclusion. First, how will the introduction of the senior responsible individual, the SRI, improve things compared with the data protection officer? Again, how will that impact on issues such as, but not limited to, adequacy? Similarly, linking back to artificial intelligence, a key principle—though not foolproof by any measure and certainly not a silver bullet, but important none the less—is the human in the loop. The Bill is currently some way short of a clear, effective definition and exposition of how meaningful human intervention, involvement and oversight will work where autonomous systems are at play. What are the Government’s plans to address that significant gap in the Bill as currently drafted?
I end where I began, with the simple truth that it is our data. Data has been described in various terms, not least as the new oil, but that definition gets us nowhere; it is so much more profound than that. Ultimately, data is part of us and, when it is put together in combination, it comes so close to giving a detailed, personal and almost complete picture of us—the digital twin, if you will. Are the Government content that the Bill does everything needed for the use of our data to be trustworthy, for it to be understood that it is our data and our decision, and for us to decide what data to deploy, for what purpose, to whom and for what time period? It is our data.
Data Protection and Digital Information Bill Debate
Lord Holmes of Richmond (Conservative - Life peer)
(7 months, 1 week ago)
Grand Committee
My Lords, it is a pleasure to take part in today’s Committee proceedings. I declare my technology interests as an adviser to Boston Limited. It is self-evident that we have been talking about data, but there could barely be a more significant piece of data than biometrics. In moving the amendment, I shall speak also to Amendments 197B and 197C, and give more than a nod to the other amendments in this group.
When we talk about data, it is always critical that we remember that it is largely our data. There could be no greater example of that than biometrics: more than data, they are parts and fragments of our very being. This is an opportune moment in the debate on the Bill to strengthen the approach to the treatment and use of biometrics, not least because they are increasingly being used by private entities. That is what Amendments 197A to 197C are all about—the establishment of a biometrics office, a code of practice and oversight, and sanctions and fines to boot. The issue is of that level of significance, and the Bill should have that strength when we are looking at such a significant part of our very being and of data protection.
Amendment 197B looks at reporting and regulatory requirements, and Amendment 197C at the position of entities that have already acted in the biometrics space prior to the passage of the Bill. In short, the amendments take principles that run through many elements of data protection and ensure that we have a clear statement on the use and deployment of biometrics in the Bill. There could be no more significant piece of data. I look forward to the Minister’s response. I thank the Ada Lovelace Institute for its help in drafting the amendments, and I look forward to the debate on this group. I beg to move.
My Lords, I have added my name in support of the stand part notices of the noble Lord, Lord Clement-Jones, to Clauses 147, 148 and 149. These clauses would abolish the office of the Biometrics and Surveillance Camera Commissioner, along with the surveillance camera code of practice. I am going to speak mainly to the surveillance camera aspect, although I was taken by the speech of the noble Lord, Lord Holmes, who made some strong points.
The UK has become one of the most surveilled countries in the democratic world. There are estimated to be over 7 million CCTV cameras in operation. I give one example: the automatic number plate recognition, ANPR, system records between 70 million and 80 million readings every day. Every car is recorded on average about three times a day. The data is held for two years. A former Surveillance Camera Commissioner, Tony Porter, said about ANPR that it,
“must surely be one of the largest data gatherers of its citizens in the world. Mining of meta-data—overlaying against other databases can be far more intrusive than communication intercept”.
Professor Sampson, the previous commissioner, said about ANPR:
“There is no ANPR legislation or act, if you like. And similarly, there is no governance body to whom you can go to ask proper questions about the extent and its proliferation, about whether it should ever be expanded to include capture of other information such as telephone data being emitted by a vehicle or how it's going to deal with the arrival of automated autonomous vehicles”.
And when it came to independent oversight and accountability, he said:
“I’m the closest thing it’s got—and that’s nothing like enough”.
I am not against the use of surveillance cameras per se—it is unarguable that they are a valuable tool in the prevention and detection of crime—but there is clearly a balance to be found. If we chose to watch everything every person does all of the time, we could eliminate crime completely, but nobody would argue that that is desirable. We can clearly see how surveillance and biometrics can be misused by states that wish to control their populations—just look at China. So there is a balance to be found between the protection of the public and intrusion into privacy.
Technology is moving incredibly rapidly, particularly with the ever-increasing capabilities of AI. As technology changes, so that balance between protection and privacy may also need to change. Yet Clause 148 will abolish the only real safeguards we have, and the only governance body that keeps an eye on that balance. This debate is not about where that balance ought to be; it is about making sure that there is some process to ensure that the balance is kept under independent review at a time when surveillance technologies and their usage are developing incredibly rapidly.
I am sure that the Minister is going to argue that, as he said at Second Reading:
“Abolishing the Surveillance Camera Commissioner will not reduce data protection”.—[Official Report, 19/12/23; col. 2216.]
He is no doubt going to tell us that the roles of the commissioner will be adequately covered by the ICO. To be honest, that completely misses the point. Surveillance is not just a question of data protection; it is a much wider question of privacy. Yes, the ICO may be able to manage the pure data protection matters, but it cannot possibly be the right body to keep the whole question of surveillance and privacy intrusion, and the related technologies, under independent review.
It is also not true that all the roles of the commissioner are being transferred to other bodies. The report by the Centre for Research into Surveillance and Privacy, or CRISP, commissioned by the outgoing commissioner, is very clear that a number of important areas will be lost, particularly: reviewing the police handling of DNA samples, DNA profiles and fingerprints; maintaining an up-to-date surveillance camera code of practice, with standards and guidance for practitioners, and encouraging compliance with that code; setting out technical and governance matters for most public body surveillance systems, including how to approach evolving technology such as AI-driven systems, including facial recognition technology; and providing guidance on technical and procurement matters to ensure that future surveillance systems are of the right standard and purchased from reliable suppliers. It is worth noting that it was the Surveillance Camera Commissioner who raised the issues around the use of Hikvision cameras, for example—not something that the ICO is likely to be able to do. Finally, we will also lose the commissioner’s reports to the Home Secretary and Parliament on public surveillance and biometrics matters.
Professor Sampson said, before he ended his time in office as commissioner:
“The lack of attention being paid to these important matters at such a crucial time is shocking, and the destruction of the surveillance camera code that we’ve all been using successfully for over a decade is tantamount to vandalism”.
He went on to say:
“It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general … It seems absolutely senseless to destroy it now”.
The security industry does not want to see these changes either, as it sees the benefits of having a clear code. The Security Systems and Alarms Inspection Board said:
“Without the Surveillance Camera Commissioner you will go back to the old days when it was like the ‘wild west’, which means you can do anything with surveillance cameras so long as you don’t annoy the Information Commissioner … so, there will not be anyone looking at new emerging technologies, looking at their technical requirements or impacts, no one thinking about ethical implications for emerging technologies like face-recognition, it will be a free-for-all”.
The British Security Industry Association said:
“We are both disappointed and concerned about the proposed abolition of the B&SCC. Given the prolific emergence of biometric technologies associated with video surveillance, now is a crucial time for government, industry, and the independent commissioner(s) to work close together to ensure video surveillance is used appropriately, proportionately, and most important, ethically”.
I do not think I can put it better than that.
While there may be better ways to achieve the appropriate safeguards than the current commissioner arrangement, this Bill simply abolishes everything that we have now and replaces the safeguards only partially, and only from a data protection perspective. I am open to discussion about how we might fill the gaps, but the abolition currently proposed by the Bill is a massively retrograde and even dangerous step, removing the only safeguards we have against the uncontrolled creep towards ever more intrusive surveillance of innocent people. As technology increases the scope for surveillance, this must be the time for greater safeguards and more independent oversight, not less. The abolition of the commissioner and the code should not happen unless clear, better safeguards are established to replace them, and this Bill simply does not do that.
My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.
When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on that list is for it to be said that the statute generates significant gaps—those words were put perfectly by the noble Viscount, Lord Stansgate. That it generates significant gaps is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.