Data Protection and Digital Information Bill Debate
Lord Clement-Jones (Liberal Democrat - Life peer)
(1 year ago)
Lords Chamber

My Lords, I thank the Minister for his introduction to the Bill today and congratulate the noble Lord, Lord de Clifford, on his maiden speech. I think we all very much appreciated his valuable perspective on SMEs having to grapple with the intricacies of data protection. I very much look forward to his contributions—perhaps in Committee, if he feels brave enough.
The Minister will have heard the concerns expressed throughout the House—not a single speaker failed to express concerns about the contents of the Bill. The right reverend Prelate the Bishop of Southwell and Nottingham reminded us that the retention and enhancement of public trust in data use and sharing is of key importance, but so much of the Bill seems almost entirely motivated by the Government’s desire to diverge from the EU to get some kind of Brexit dividend.
As we have heard from all around the House, the Bill dilutes the rights of data subjects where it should strengthen them; only then could we all agree on the benefits of data sharing without the risks involved. The Equality and Human Rights Commission is clearly of that view, alongside numerous others, such as the Ada Lovelace Institute and as many as 26 privacy advocacy groups. Even on the Government’s own estimates, the Bill will have a minimal positive impact on compliance costs—in fact, it will simply lead to companies doing business in Europe having to comply with two sets of regulations.
I will be specific. The noble Lord, Lord Davies of Brixton, set out the catalogue, and I will go through a number of areas where I believe those rights are being diluted. The amended and more subjective definition of “personal data” will narrow the scope of what is considered personal data, as the right reverend Prelate the Bishop of St Albans pointed out. Schedule 1 sets out a new annexe to the GDPR, with the types of processing activities that the Government have determined have a recognised legitimate interest and will not require a legitimate interests balancing test to be carried out. Future Secretaries of State can amend or add to this list of recognised legitimate interests through secondary legislation. As a result, as the noble Baroness, Lady Bennett, pointed out, it will become easier for political parties to target children as young as 14 during election campaigns, even though they cannot vote until they are 16 or 18, depending on the jurisdiction.
The Bill will change the threshold for refusing a subject access request, which will widen the grounds on which an organisation could refuse requests. The noble Lord, Lord Sikka, reminded us of the existing difficulties of making those subject access requests. Clause 12, added on Report in the Commons, further tips power away from the individual’s ability to access data.
There are also changes to the automated decision-making provisions under Article 22 of the GDPR—the noble Lord, Lord Holmes, reminded us of the importance of the human in the loop. The Bill replaces Article 22 with articles that reduce human review of automated decision-making. As the noble Lord, Lord Knight, pointed out, Article 22 should in fact be strengthened so that it applies to partly automated processing as well, and it should give rights to people affected by an automated decision, not just those who provide data. This should be the case especially in the workplace. A decision about you may be determined by data about other people whom you may never have met.
The Bill amends the circumstances in which personal datasets can be reused for research purposes. New clarifying guidance would have been sufficient, but for-profit commercial research is now included. As the noble Lords, Lord Knight and Lord Davies, pointed out and as we discussed in debates on the then Online Safety Bill, the Bill does nothing where it really matters: on public interest researcher access.
The Bill moves away from UK GDPR requirements for mandatory data protection officers, and it also removes the requirement for data protection impact assessments. All this simply sets up a potential dual compliance system with less assurance—with what benefit? Under the new Bill, a controller or processor will be exempt from the duty to keep records, unless they are carrying out high-risk processing activities. But how effective will this be? One of the main ways of demonstrating compliance with GDPR is to have a record of processing activities.
There are also changes to the Information Commissioner’s role. We are all concerned about whether the creation of a new board will enable the ICO to maintain its current level of independence for data adequacy purposes. This is so important, as the noble Baroness, Lady Young, and my noble friend Lord McNally pointed out.
As regards intragroup transfers, there is concern from the National AIDS Trust that Clause 5, permitting the intragroup transmission of personal health data
“where that is necessary for … administrative purposes”,
could mean that HIV/AIDS status is inadequately protected in workplace settings.
Schedule 5 to the Bill amends Chapter 5 of the UK GDPR to reform the UK’s regime for international transfers, with potential adverse consequences for business. The noble Lord, Lord Kirkhope, reminded us of the dangers of adopting standards internationally that are too low. This clearly has the potential to provide less protection for data subjects than the current test.
In Clause 17, the Bill removes a key enabler of collective interests: consultation with those affected by data processing during the data protection risk assessment process. It fails to provide alternative opportunities. Then there is the removal of the legal obligation to appoint a representative. This risks data breaches not being reported, takes away a channel of communication used by the ICO to facilitate its investigations, and increases the frustration of UK businesses in dealing with overseas companies that come to the UK market underprepared to comply with the UK GDPR.
Given that catalogue, it is hardly surprising that so many noble Lords have raised the issue of data adequacy. If I read out the list of all the noble Lords who have mentioned it, I would probably mention almost every single speaker in this debate. It is clear that the Bill significantly lowers data protection standards in the UK, as compared with the EU. On these Benches, our view is that this will undermine the basis of the UK’s EU data adequacy. The essential equivalence between the UK and the EU regimes has been critical to business continuity following Brexit. The Government’s own impact assessment acknowledges that, as the UK diverges from the EU GDPR, the risk of the EU revoking its adequacy decisions will increase. So I very much hope that the Minister, in response to all the questions he has been asked about data adequacy, has some pretty good answers, because there is certainly a considerable degree of concern around the House about the future of data adequacy.
In addition, there are aspects of the Bill that are just plain wrong. The Government need to deliver in full on the commitments made to bereaved families during the passage of what became the Online Safety Act regarding access to their children’s data, as we have heard today from across the House, notably from the noble Baroness, Lady Kidron, who insisted that this be extended to all deaths of children. I very much hope that the Minister will harden up his assurances at the end of the debate.
The noble Lords, Lord Kamall and Lord Vaux, questioned the abolition of the Surveillance Camera Commissioner and the diminution of the duties relating to biometric data. Society is witnessing an unprecedented acceleration in the capability and reach of surveillance technologies, particularly live facial recognition, and we need the commissioner and the Surveillance Camera Code of Practice in place. As the Ada Lovelace Institute says in its report Countermeasures, we need new and more comprehensive legislation on the use of biometrics, and the Equality and Human Rights Commission agrees with that too.
As regards what the noble Lord, Lord Sikka, described as unrestrained financial powers, inserted at Commons Report stage, Sir Stephen Timms MP, chair of the Work and Pensions Select Committee, very rightly expressed strong concerns about this, as did many noble Lords today, including the noble Baroness, Lady Young, and the noble Lords, Lord Knight and Lord Fox. These powers are entirely disproportionate and we will be strongly opposing them.
Then we have the new national security certificates and designation notices, which were mentioned by the right reverend Prelate the Bishop of St Albans. These would give the Home Secretary great and unaccountable powers to authorise the police to violate our privacy rights without challenge. The Government have failed to explain why they believe these clauses are necessary to safeguard national security.
Then there is a whole series of missed opportunities in the Bill. As the noble Lord, Lord Knight, said in his opening speech, the Bill was an opportunity to create ethical, transparent and safe standards for AI systems. A number of noble Lords across the House, including the noble Lord, Lord Kamall, the noble Baroness, Lady Young, the right reverend Prelate the Bishop of Southwell and Nottingham, and my noble friend Lord McNally, said that this is a wasted opportunity to create measures adequate to an era of ubiquitous use of data through AI systems. The noble Baroness, Lady Kidron, in particular talked about this in relation to children, generative AI and educational technology. The noble Lord, Lord Holmes, talked of this in the public sector, where it is so important as well.
The EU has just agreed in principle to a new AI Act. We are miles behind the curve. Then, of course, we have the new identification verification framework. The UK has chosen not to allow private sector digital ID systems to be used for access. Perhaps the Government could explain why that is the case.
There are a number of other areas, such as new models of personal data control, which were advocated as long ago as 2017 in the Hall-Pesenti review. Why are the Government not being more imaginative in that sense? There is also the failure to create a new offence of identity theft. That seems a great missed opportunity in this Bill.
As the noble Baroness, Lady Kidron, mentioned, there is the question of holding AI system providers legally accountable for the generation of child sexual abuse material online by using their datasets. My noble friend Lord McNally and the noble Lord, Lord Kamall, raised the case of ICO v Experian. Why are the Government not taking the opportunity to address the outcome of that case?
In the face of the need to do more to protect citizens’ rights, this Bill is a dangerous distraction. It waters down rights, it is a huge risk to data adequacy, it is wrong in many areas and it is a great missed opportunity in many others. We on these Benches will oppose a Bill which appears to have very few friends around the House. We want to amend a great many of the provisions of the Bill and we want to scrutinise many other aspects of it where the amendments came through at a very late stage. I am afraid the Government should expect this Bill to have a pretty rough passage.
Data Protection and Digital Information Bill Debate
Lord Clement-Jones (Liberal Democrat - Life peer)
(9 months, 1 week ago)
Grand Committee

My Lords, we are beginning rather a long journey—at least, it feels a bit like that. I will speak to Amendments 1, 5 and 288, and the Clause 1 stand part notice.
I will give a little context about Clause 1. In a recent speech, the Secretary of State said something that Julia Lopez repeated this morning at a conference I was at:
“The Data Bill that I am currently steering through Parliament with my wonderful team of ministers”—
I invite the Minister to take a bow—
“is just one step in the making of this a reality—on its own it will add £10 billion to our economy and most crucially—we designed it so that the greatest benefit would be felt by small businesses across our country. Cashing in on a Brexit opportunity that only we were prepared to take, and now those rewards are going to be felt by the next generation of founders and business owners in local communities”.
In contrast, a coalition of 25 civil society organisations wrote to the Secretary of State, calling for the Bill to be dropped. The signatories included trade unions as well as human rights, healthcare, racial justice and other organisations. On these Benches, we share the concerns about the Government’s proposals. They will seriously weaken data protection rights in the UK and will particularly harm people from marginalised communities.
So that I do not have to acknowledge them at every stage of the Bill, I will now thank a number of organisations. I am slightly taking advantage of the fact that our speeches are not limited but will be extremely limited from Monday onwards—the Minister will have 20 minutes; I, the noble Baroness, Lady Jones, and colleagues will have 15; and Back-Benchers will have 10. I suspect we are into a new era of brevity, but I will take advantage today, believe me. I thank Bates Wells, Big Brother Watch, Defend Digital Me, the Public Law Project, Open Rights Group, Justice, medConfidential, Chris Pounder, the Data & Marketing Association, CACI, Preiskel & Co, AWO, Rights and Security International, the Advertising Association, the National AIDS Trust, Connected by Data and the British Retail Consortium. That is a fair range of organisations that see flaws in the Bill. We on these Benches agree with them and believe that it greatly weakens the existing data protection framework. Our preference, as we expressed at Second Reading, is that the Bill is either completely revised on a massive scale or withdrawn in the course of its passage through the Lords.
I will mention one thing; I do not think the Government are making any great secret of it. The noble Baroness, Lady Kidron, drew my attention to the Keeling schedule and to Section 2(2), which give the game away. The Information Commissioner will no longer have to pay regard to certain aspects of the protection of personal data—all the words have been deleted, which is quite extraordinary. It is clear that the Bill will dilute protections around personal data processing, reducing the scope of data protected by the safeguards within the existing law. In fact, the Bill gives more power to data users and takes it away from the people the data is about.
I am particularly concerned about the provisions that change the definition of personal data and the purposes for which it can be processed. There is no need to redraft the definitions of personal data, research or the boundaries of legitimate interests. We have made it very clear over a period of time that guidance from the ICO would have been adequate in these circumstances, rather than a whole piece of primary legislation. The recitals are readily available for guidance, and the Government should have used them. More data will be processed, with fewer safeguards than currently permitted, as it will no longer meet the threshold of personal data, or it will be permitted under the new recognised legitimate interest provision, which we will debate later. That combination is a serious threat to privacy rights in the UK, and that is the context of a couple of our probing amendments to Clause 1—I will come on to the clause stand part notice.
As a result of these government changes, data in one organisation’s hands may be anonymous, while that same information in another organisation’s hands can be personal data. The factor that determines whether personal data can be reidentified is whether the appropriate organisational measures and technical safeguards exist to keep the data in question separate from the identity of specific individuals. That is a very clear decision by the CJEU; the case is SRB v EDPS, if the Minister is interested.
The ability to identify an individual indirectly with the use of additional information is due to the lack of appropriate organisational and technical measures. If the organisation had such appropriate measures that separated data into different silos, it would not be able to use the additional information to identify such an individual. The language of technical and organisational measures is used in the definition of pseudonymisation in Clause 1(3)(d), which refers to “indirectly identifiable” information. If such measures existed, the data would be properly pseudonymised, in which case it would no longer be indirectly identifiable.
A lot of this depends on how data-savvy organisations are, so those that are not well organised and do not have the right technology will get a free pass. That cannot be right, so I hope the Minister will respond to that. We need to make sure that personal data remains personal data, even if some may claim it is not.
Regarding my Amendment 5, can the Government explicitly confirm that personal data that is
“pseudonymised in part, but in which other indirect identifiers remain unaltered”
will remain personal data after this clause is passed? Can the Government also confirm that, if an assessment is made that some data is not personal data but that assessment is later shown to be incorrect, the data will have been personal data at all times and should be treated as such by controllers, processors and the Information Commissioner, about whom we will talk when we come to the relevant future clauses?
Amendment 288 simply asks the Government for an impact assessment. If they are so convinced that the definition of personal data will not change in practice, they should be prepared to submit to some kind of impact assessment after the Bill comes into effect. Those are probing amendments, and it would be useful to know whether the Government have any intention to assess what the impact of their changes to the Bill would be if they were passed. More importantly, we believe broadly that Clause 1 is not fit for purpose, and that is why we have tabled the clause stand part notice.
As we said, this change will erode people’s privacy en masse. The impacts could include more widespread use of facial recognition and, in that context, an increase in data processing with minimal safeguards, as the threshold for personal data would be met only if the data subject is on a watchlist and therefore identified. If an individual is not on a watchlist and images are deleted after being checked against it, the data may not be considered personal and so would not qualify for data protection obligations.
People’s information could be used to train AI without their knowledge or consent. Personal photos scraped from the internet and stored to train an algorithm would no longer be seen as personal data, as long as the controller does not recognise the individual, is not trying to identify them and will not process the data in such a way that would identify them. The police would have increased access to personal information. Police and security services will no longer have to go to court if they want access to genetic databases; they will be able to access the public’s genetic information as a matter of routine.
Personal data should be defined by what type of data it is, not by how easy it is for a third party to identify an individual from it. That is the bottom line. Replacing a stable, objective definition that grants rights to the individual with an unstable, subjective definition that determines the rights an individual has over their data according to the capabilities of the processor is illogical, complex and bad law-making. It is contrary to the very premise of data protection law, which is founded upon personal data rights. We start on the wrong foot in Clause 1, and it continues. I beg to move.
My Lords, I rise to speak in favour of Amendments 1 and 5 in this group and with sympathy towards Amendment 4. The noble Lord, Lord Clement-Jones, will remember when I was briefly Minister for Health. We had lots of conversations about health data. One of the things we looked at was a digitised NHS. It was essential if we were to solve many problems of the future and have a world-class NHS, but the problem was that we had to make sure that patients were comfortable with the use of their data and the contexts in which it could be used.
When we were looking to train AI, it was important that we made sure that the data was as anonymous as possible. For example, we looked at things such as synthetic and pseudonymised data. There is another point: having done the analysis and looked at the dataset, if you see an identifiable group of people who may well be at risk, how can you reverse-engineer that data to notify those patients that they should be contacted for further medical intervention?
I know that that makes it far too complicated; I just wanted to rise briefly to support the noble Lord, Lord Clement-Jones, on this issue, before the new rules come in next week. It is essential that the users, the patients—in other spheres as well—have absolute confidence that their data is theirs and are given the opportunity to give permission or opt out as much as possible.
One of the things that I said when I was briefed as a Health Minister was that we can have the best digital health system in the world, but it is no good if people choose to opt out or do not have confidence. We need to make sure that the Bill gives those patients that confidence where their data is used in other areas. We need to toughen this bit up. That is why I support Amendments 1 and 5 in the name of the noble Lord, Lord Clement-Jones.
My Lords, I thank the noble Lords, Lord Kamall, Lord Davies of Brixton and Lord Bassam, and the noble Baroness, Lady Harding, for their support for a number of these amendments. Everybody made a common point about public trust, particularly in the context of health data.
As the noble Lord, Lord Kamall, said, we had a lot of conversations during the passage of the Health and Care Act and the noble Lord and his department increasingly got it: proper communication about the use of personal, patient data is absolutely crucial to public trust. We made quite a bit of progress with NHSE and the department starting to build in safeguards and develop the concept of access to, rather than sharing of, personal data. I heard what the noble Lord, Lord Davies, said about a locked box and I think that having access for research, rather than sharing data around, is a powerful concept.
I found what the Minister said to be helpful. I am afraid that we will have to requisition a lot of wet towels during the passage of the Bill. There are a number of aspects to what he said, but the bottom line is that he is saying that there is no serious divergence from the current definition of personal data. The boot is on the other foot: where is the Brexit dividend? The Minister cannot have it both ways.
I am sure that, as we go through this and the Minister says, “It’s all in recital 26”, my response would be that the ICO could easily develop guidance based on that. That would be splendid; we would not have to go through the agony of contending with this data protection Bill. It raises all those issues and creates a great deal of angst. There are 26 organisations, maybe more—42, I think—writing to the Secretary of State about one aspect of it or another. The Government have really created a rod for their own back, when they could have created an awful lot of guidance, included a bit on digital identity in the Bill and done something on cookies. What else is there not to like? As I say, the Government have created a rod for their own back.
As regards pseudonymised data, that is also helpful. We will hold the Minister to that as we go through, if the Minister is saying that that is personal data. I am rather disappointed by the response to Amendment 5, but I will take a very close look at it with several wet towels.
We never know quite whether CJEU judgments will be treated as precedent by this Government or where we are under the REUL Act. I could not tell you at this moment. However, it seems that the Minister is again reassuring us that the CJEU’s judgments on personal data are valid and are treated as being part of UK law for this purpose, which is why there is no change to the definition of personal data as far as he is concerned. All he is doing is importing the recitals into Clause 1. I think I need to read the Minister’s speech pretty carefully if I am going to accept that. In the meantime, we move on. I beg leave to withdraw the amendment.
My Lords, in the nearly nine years that I have been in this House, I have often played the role of bag carrier to the noble Baroness, Lady Kidron, on this issue. In many ways, I am rather depressed that once again we need to make the case that children deserve a higher bar of protection than adults in the digital world. As the noble Baroness set out—I will not repeat it—the age-appropriate design code was a major landmark in establishing that you can regulate the digital world just as you can the physical world. What is more, it is rather joyful that when you do, these extraordinarily powerful tech companies change their products in the way that you want them to.
This is extremely hard-fought ground that we must not lose. It takes us to what feels like a familiar refrain from the Online Safety Act and the Digital Markets, Competition and Consumers Bill, which we are all still engaged in: the question of whether you need to write something in the Bill and whether, by doing so, you make it more clear or less clear.
Does my noble friend the Minister agree with the fundamental principle, enshrined in the Data Protection Act 2018, that children deserve a higher bar of protection in the online world and that children’s data needs to be protected at a much higher level? If we can all agree on that principle first, then the question is: how do we make sure that this Bill does not weaken the protection that children have?
I am trying to remember on which side of the “put it in the Bill or not” debate I have been during discussions on each of the digital Bills that we have all been working on over the last couple of years. We have a really vicious problem where, as I understand it, the Government keep insisting that the Bill does not water down data protection and therefore there is no need to write anything into it to protect children’s greater rights. On the other hand, I also hear that it will remove bureaucracy and save businesses a lot of money. I have certainly been in rooms over the last couple of years where business representatives have told me, not realising I was one of the original signatories to the amendment that created the age-appropriate design code, how dreadful it was because it made their lives much more complicated.
I have no doubt that if we create a sense—which is what it is—that companies do not need to do quite as much as they used to for children in this area, that sense will create, if not a wide-open door, then a door left ajar that enables businesses to walk through and take the path of least resistance, which is doing less to protect children. That is why, in this case, I come down on the side of wanting to put it explicitly in the Bill, in whatever wording my noble friend the Minister thinks appropriate, that we are really clear that this creates no change at all in the approach for children and children’s data.
That is what this group of amendments is about. I know that we will come back to a whole host of other areas where there is a risk that children’s data could be handled differently from the way envisaged in that hard-fought battle for the age-appropriate design code but, on this group alone, it would be helpful if my noble friend the Minister could help us establish that firm principle and commit to coming back with wording that will firmly establish it in the Bill.
My Lords, I keep getting flashbacks. This one is to the Data Protection Act 2018, although I think it was 2017 when we debated it. It is one of the huge achievements of the noble Baroness, Lady Kidron, to have introduced, and persuaded the Government to introduce, the age-appropriate design code into the Act, and—as she and the noble Baroness, Lady Harding, described—to see it spread around the world and become the gold standard. It is hardly surprising that she is so passionate about wanting to make sure that the Bill does not water down the data rights of children.
I think the most powerful amendment in this group is Amendment 290. For me, it absolutely encapsulates what we need to do in making sure that nothing in the Bill waters down children’s rights. If I were to choose one of the noble Baroness’s amendments in this group, it would be that one: it would absolutely give the assurance and scotch the point about legal uncertainty created by the Bill.
Both noble Baronesses asked: if the Government are not watering down children’s protections, why can they not say so? Why can they not, in a sense, repeat the words of Paul Scully when he was debating the Bill? He said:
“We are committed to protecting children and young people online. The Bill maintains the high standards of data protection that our citizens expect and organisations will still have to abide by our age-appropriate design code”.
He uses “our”, so he is taking full ownership of it. He went on:
“Any breach of our data protection laws will result in enforcement action by the Information Commissioner’s Office”.—[Official Report, Commons, 17/4/23; col. 101.]
I would love that enshrined in the Bill. It would give us a huge amount of assurance.
My Lords, we on the Labour Benches have become co-signatories to the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding. The noble Baroness set out very clearly and expertly the overarching purpose of retaining the level of protection currently afforded by the Data Protection Act 2018. Amendments 2 and 3 specifically stipulate that, where data controllers know, or should reasonably know, that a user is a child, they should be given the data protection codified in that Act. Amendment 9 takes it a stage further, including children’s data in the definition of sensitive personal data and giving it the benefit of a heightened level of protection—quite rightly, too. Finally, Amendment 290—the favourite of the noble Lord, Lord Clement-Jones—attempts to hold Ministers to the commitment made by Paul Scully in the Commons to maintain existing standards of data protection carried over from that 2018 Act.
Why is all this necessary? I suspect that the Minister will argue that it is not needed because Clause 5 already provides for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals and, in particular, of children, who require special protection.
We disagree with that argument. In the interests of brevity and the spirit of the recent Procedure Committee report, which says that we should not repeat each other’s arguments, I do not intend to speak at length, but we have a principal concern: to try to understand why the Government want to depart from the standards of protection set out in the age-appropriate design code—the international gold standard—which they so enthusiastically signed up to just five or six years ago. Given the rising levels of parental concern over harmful online content and well-known cases highlighting the harms that can flow from unregulated material, why do the Government consider it safe to water down the regulatory standards at this precise moment in time? The noble Baroness, Lady Kidron, valuably highlighted the impact of the current regulatory framework on companies’ behaviour. That is exactly what legislation is designed to do: to change how we look at things and how we work. Why change that? As she has argued very persuasively, it is and has been hugely transformative. Why throw away that benefit now?
My attention was drawn to one example of what can happen by a briefing note from the 5Rights Foundation. As it argued, children are uniquely vulnerable to harm and risk online. I thought its set of statistics was really interesting. By the age of 13, 72 million data points have already been collected about a child. They are often not used in children’s best interests; for example, the data is often used to feed recommender systems and algorithms that are designed to keep attention at all costs and have been found to push harmful content at children.
When this happens repeatedly over time, it can have catastrophic consequences, as we know. The coroner in the Molly Russell inquest found that she had been recommended a stream of depressive content by algorithms, leading the coroner to rule that she
“died from an act of self-harm whilst suffering from depression and the negative effects of online content”.
We do not want more Molly Russell cases. Progress has already been made in this field; we dispense with it at our peril. Can the Minister explain today the thinking and logic behind the changes that the Government have brought forward? Can he estimate the impact that the new lighter-touch regime, as we see it, will have on child protection? Have the Government consulted extensively with those in the sector who are properly concerned about child protection issues, and what sort of responses have the Government received?
Finally, why have the Government decided to take a risk with the sound framework that was already in place and built on during the course of the Online Safety Act? We need to hear very clearly from the Minister how they intend to engage with groups that are concerned about these child protection issues, given the apparent loosening of the current framework. The noble Baroness, Lady Harding, said that this is hard-fought ground; we intend to continue making it so because these protections are of great value to our society.
I believe it restates what the Government feel is clearly implied or stated throughout the Bill: that children’s safety is paramount. Therefore, putting it there is either duplicative or confusing; it reduces the clarity of the Bill. In no way is this to say that children are not protected—far from it. The Government feel it would diminish the clarity and overall cohesiveness of the Bill to include it.
My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?
Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?
My Lords, I am going to get rather used to introducing a smorgasbord of probing amendments and stand part notices throughout most of the groups of amendments as we go through them. Some of them try to find out the meaning of areas in the Bill and others are rather more serious and object to whole clauses.
I am extremely sympathetic to the use of personal data for research purposes, but Clause 2, which deals with research, is rather deceptive in many ways. That is because “scientific research” and “scientific research purposes” will now be defined to mean
“any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.
The rub lies in the words “commercial or non-commercial activity”. A loosening of requirements on purpose limitation will assist commercial and non-commercial organisations in reusing personal data obtained from third parties for research but will do nothing to increase protection for individual data subjects in these circumstances. That is the real Pandora’s box that we are opening as regards commercial activity. It opens the door to Meta to use our personal data for its own purposes under the guise of research. That seems very much to be a backward step. That is why I tabled Amendment 6, which would require a public interest test to apply to all uses under this clause, not just public health uses.
Then there is the question of consent under Clause 3. How is the lawful and moral right of patients, constituents or data subjects to dissent from medical research, for instance, enshrined in this clause? We have seen enough issues relating to health data, opt-outs and so on to begin to destroy public trust, if we are not careful. We have to be extremely advertent to the fact that the communications have to be right; there has to be the opportunity to opt out.
In these circumstances, Amendment 7 would provide that a data subject has been given the opportunity to express dissent or an objection and has not so expressed it. That is then repeated in Clause 26. Again, we are back to public trust: we are not going to gain it. I am very much a glass-half-full person as far as new technology, AI and the opportunities for the use of patient data in the health service are concerned. I am an enthusiast for that, but it has to be done in the right circumstances.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for this series of amendments.
I will first address Amendment 6, which seeks to amend Clause 2. As the noble Lord said, the definitions created by Clause 2, including “scientific research purposes”, are based on the current wording in recital 159 to the UK GDPR. We are changing not the scope of these definitions but their legal status. This amendment would require individual researchers to assess whether their research should be considered to be in the public interest, which could create uncertainty in the sector and discourage research. This would be more restrictive than the current position and would undermine the Government’s objectives to facilitate scientific research and empower researchers.
We have maintained a flexible scope as to what is covered by “scientific research” while ensuring that the definition is still sufficiently narrow in that it can cover only what would reasonably be seen as scientific research. This is because the legislation needs to be able to adapt to the emergence of new areas of innovative research. Therefore, the Government feel that it is more appropriate for the regulator to add more nuance and context to the definition. This includes the types of processing that are considered—
I am sorry to interrupt but it may give the Box a chance to give the Minister a note on this. Is the Minister saying that recital 159 includes the word “commercial”?
I am afraid I do not have an eidetic memory of recital 159, but I would be happy to—
That is precisely why I asked the question in the middle of the Minister’s speech: to give the Box a chance to respond, I hope.
Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.
Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.
In response to the noble Baroness, Lady Jones, on why public health research is being singled out, as she stated, this part of the legislation just adds an additional safeguard to studies into public health, ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. “Public interest” is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.
On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance, although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have done so via this Bill.
Amendment 7 to Clause 3 would undermine the broad consent concept for scientific research. Clause 3 places the existing concept of “broad consent” currently found in recital 33 to the UK GDPR on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, where possible individuals will have the option to consent to only part of the processing and can withdraw their consent at any time.
Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.
Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose unless a public interest exemption applies and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.
Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies, leaving them unable to continue.
Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.
The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.
Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.
With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.
Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159, the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. They do not preclude commercial organisations performing scientific research; indeed, the ICO’s own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention commercial activity but, as the ICO’s guidance is clear on this already, the Government feel that it is appropriate to put this on a statutory footing.
My Lords, that was intriguing. I thank the Minister for his response. It sounds as though, again, guidance would have been absolutely fine, but what is there not to like about the ICO bringing clarity? It was quite interesting that the Minister used the phrase “uncertainty in the sector” on numerous occasions and that is becoming a bit of a mantra as the Bill goes on. We cannot create uncertainty in the sector, so the poor old ICO has been labouring in the vineyard for the last few years to no purpose at all. Clearly there has been uncertainty in the sector of a major description, and all its guidance and all the work that it has put in over the years have been wholly fruitless, really. It is only this Government that have grabbed the agenda with this splendid 300-page data protection Bill that will clarify this for business. I do not know how much they will have to pay to get new compliance officers or whatever it happens to be, but the one thing that the Bill will absolutely not create is greater clarity.
I am a huge fan of making sure that we understand what the recitals have to say, and it is very interesting that the Minister is saying that the recital is silent but the ICO’s guidance is pretty clear on this. I am hugely attracted by the idea of including recital 38 in the Bill. It is another lightbulb moment from the noble Baroness, Lady Kidron, who has these moments, rather like with the age-appropriate design code, which was a huge one.
We are back to the concern, whether in the ICO guidance, the Bill or wherever, that scientific research should need to be in the public interest in order to qualify for exemption from the consents that are normally required for the use of personal data. The Minister said, “Well, of course we think that scientific research is in the public interest; that is its very definition”. So why does only public health research need that public interest test and not the other aspects? Is it because, for instance, the opt-out was a bit of a disaster and 3 million people opted out of allowing their health data to be shared or accessed by GPs? Yes, it probably is.
Do the Government want a similar kind of disaster to happen, in which people get really excited about Meta or other commercial organisations getting hold of their data, a public outcry ensues and they therefore have to introduce a public interest test on that? What is sauce for the goose is sauce for the gander. I do not think that personal data should be treated in a particularly different way in terms of its public interest, just because it is in healthcare. I very much hope that the Minister will consider that.
My Lords, I hope this is another lightbulb moment, as the noble Lord, Lord Clement-Jones, suggested. As well as Amendment 10, I will speak to Amendments 35, 147 and 148 in my name and the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones. I thank them both. The purpose of these amendments is to move the Bill away from nibbling around the edges of GDPR in pursuit of post-Brexit opportunities and to actually deliver a post-Brexit opportunity.
These amendments would put the UK on an enhanced path of data sophistication while not challenging equivalence, which we will undoubtedly discuss during the Committee. I echo the words of the noble Lord, Lord Allan, who at Second Reading expressed deep concern that equivalence was not a question of an arrangement between the Government and the EU but one that would be picked up by data activists taking strategic litigation to the courts.
Data protection as conceived by GDPR and in this Bill is primarily seen as an arrangement between an individual and an entity that processes that data—most often a commercial company. But, as evidenced by the last 20 years, the real power lies in holding either vast swathes of general data, such as those used by LLMs, or large groups of specialist data such as medical scans. In short, the value—in all forms, not simply financial—lies in big data.
As the value of data became clear, ideas such as “data is the new oil” and data as currency emerged, alongside the notion of data fiduciaries or data trusts, where you can place your data collectively. One early proponent of such ideas was Jaron Lanier, inventor of virtual reality; I remember discussing it with him more than a decade ago. However, these ideas have not found widespread practical application, possibly because they are normally based around ideas of micropayments as the primary value—and very probably because they rely on data subjects gathering their data, so they are for the boffins.
During the passage of the DPA 2018, one noble Lord counted the number of times the Minister said the words “complex” and “complicated” while referring to the Bill. Data law is complex, and the complicated waterfall of its concepts and provisions eludes most non-experts. That is why I propose the four amendments in this group, which would give UK citizens access to data experts for matters that concern them deeply.
Amendment 10 would define the term “data community”, and Amendment 35 would give a data subject the power to assign their data rights to a data community for specific purposes and for a specific time period. Amendment 147 would require the ICO to set out a code of conduct for data communities, including guidance on establishing, operating and joining a data community, as well as guidance for data controllers and data processors on responding to requests made by data communities. Amendment 148 would require the ICO to keep a register of data communities, to make it publicly available and to ensure proper oversight. Together, they would provide a mechanism for non-experts—that is, any UK citizen—to assign their data rights to a community run by representatives that would benefit the entire group.
Data communities diverge from previous attempts to create big data for the benefit of users, in that they are not predicated on financial payments and neither does each data subject need to access their own data via the complex rules and often obstructive interactions with individual companies. They put rights holders together with experts who do it on their behalf, by allowing data subjects to assign their rights so that an expert can gather the data and crunch it.
This concept is based on a piece of work done by a colleague of mine at the University of Oxford, Dr Reuben Binns, an associate professor in human-centred computing, in association with the Worker Info Exchange. Since 2016, individual Uber drivers, with help from their trade unions and the WIE, have asked Uber for the data that showed their jobs, earnings, movements, waiting times and so on. It took many months of negotiation, conducted via data protection lawyers, as each driver individually asked for successive pieces of information that Uber, at first, resisted giving them and then, after litigation, provided.
After a period of time, a new cohort of drivers was recruited, and it was only when several hundred drivers were poised to ask the same set of questions that a formal arrangement was made between Uber and WIE, so that they could be treated as a single group and all the data would be provided about all the drivers. This practical decision allowed Dr Binns to look at the data en masse. While an individual driver knew what they earned and where they were, what became visible when looking across several hundred drivers is how the algorithm reacted to those who refused a poorly paid job, who was assigned the lucrative airport runs, whether where you started impacted on your daily earnings, whether those who worked short hours were given less lucrative jobs, and so on.
This research project continues after several years and benefits from a bespoke arrangement that could, by means of these amendments, be strengthened and made an industry-wide standard with the involvement of the ICO. If it were routine, it would provide opportunities equally for challenger businesses, community groups and research projects. Imagine if a group of elderly people who spend a lot of time at home were able to use a data community to negotiate cheap group insurance, or imagine a research project where I might assign my data rights for the sole purpose of looking at gender inequality. A data community would allow any group of people to assign their rights, rights that are more powerful together than apart. This is doable—I have explained how it has been done. With these amendments, it would be routinely available, contractual, time-limited and subject to a code of conduct.
As it stands, the Bill is regressive for personal data rights and does not deliver the promised Brexit dividends. But there are great possibilities, without threatening adequacy, that could open markets, support innovation in the UK and make data more available to groups in society that rarely benefit from data law. I beg to move.
My Lords, I think this is a lightbulb moment—it is inspired, and this suite of amendments fits together really well. I entirely agree with the noble Baroness, Lady Kidron, that this is a positive aspect. If the Bill contained these four amendments, I might have to alter my opinion of it—how about that for an incentive?
This is an important subject. It is a positive aspect of data rights. We have not got this right yet in this country. We still have great suspicion about sharing and access to personal data. There is almost a conspiracy theory around the use of data, the use of external contractors in the health service and so on, which is extremely unhelpful. If individuals were able to share their data with a trusted hub—a trusted community—that would make all the difference.
Like the noble Baroness, Lady Kidron, I have come across a number of influences over the years. I think the first time many of us came across the idea of data trusts or data institutions was in the Hall-Pesenti review carried out by Dame Wendy Hall and Jérôme Pesenti in 2017. They made a strong recommendation to the Government that they should start thinking about how to operationalise data trusts. Subsequently, organisations such as the Open Data Institute did some valuable research into how data trusts and data institutions could be used in a variety of ways, including in local government. Then the Ada Lovelace Institute did some very good work on the possible legal basis for data trusts and data institutions. Professor Irene Ng was heavily engaged in setting up what was called the “hub of all things”. I was not quite convinced by how it was going to work legally in terms of data sharing and so on, but in a sense we have now got to that point. I give all credit to the academic whom the noble Baroness mentioned. If he has helped us to get to this point, that is helpful. It is not that complicated, but we need full government backing for the ICO and the instruments that the noble Baroness put in her amendments, including regulatory oversight, because it will not be enough simply to have codes that apply. We have to have regulatory oversight.
I thank the noble Baroness, Lady Kidron, for raising this interesting and compelling set of ideas. I turn first to Amendments 10 and 35 relating to data communities. The Government recognise that individuals need to have the appropriate tools and mechanisms to easily exercise their rights under the data protection legislation. It is worth pointing out that current legislation does not prevent data subjects authorising third parties to exercise certain rights. Article 80 of the UK GDPR also explicitly gives data subjects the right to appoint not-for-profit bodies to exercise certain rights on their behalf, including the rights to bring a complaint to the ICO, to appeal against a decision of the ICO, to bring legal proceedings against a controller or processor and to receive compensation.
The concept of data communities exercising certain data subject rights is closely linked with the wider concept of data intermediaries. The Government recognise the existing and potential benefits of data intermediaries and are committed to supporting them. However, given that data intermediaries are new, we need to be careful not to distort the sector at such an early stage of development. As in many areas of the economy, officials are in regular contact with businesses, and the data intermediary sector is no different. One such engagement is the DBT’s Smart Data Council, which includes a number of intermediary businesses that advise the Government on the direction of smart data policy. The Government would welcome further and continued engagement with intermediary businesses to inform how data policy is developed.
I am sorry, but the Minister used a pretty pejorative word: “distort” the sector. What does he have in mind?
I did not mean to be pejorative; I merely point out that before embarking on quite a far-reaching policy—as noble Lords have pointed out—we would not want to jump the gun prior to consultation and researching the area properly. I certainly do not wish to paint a negative portrait.
It is a moment at which I cannot set a firm date for a firm set of actions, but on the other hand I am not attempting to punt it into the long grass either. The Government do not want to introduce a prescriptive framework without assessing potential risks, strengthening the evidence base and assessing the appropriate regulatory response. For these reasons, I hope that for the time being the noble Baroness will not press these amendments.
The noble Baroness has also proposed Amendments 147 and 148 relating to the role of the Information Commissioner’s Office. Given my response just now to the wider proposals, these amendments are no longer necessary and would complicate the statute book. We note that Clause 35 already includes a measure that will allow the Secretary of State to request the Information Commissioner’s Office to publish a code on any matter that she or he sees fit, so this is an issue we could return to in future if such a code were deemed necessary.
My Lords, I am sorry to keep interrupting the Minister. Can he give us a bit of a picture of what he has in mind? He said that he did not want to distort things at the moment, that there were intermediaries out there and so on. That is all very well, but is he assuming that a market will be developed or is developing? What overview of this does he have? In a sense, we have a very clear proposition here, which the Government should respond to. I am assuming that this is not a question just of letting a thousand flowers bloom. What is the government policy towards this? If you look at the Hall-Pesenti review and read pretty much every government response—including to our AI Select Committee, where we talked about data trusts and picked up the Hall-Pesenti review recommendations—you see that the Government have been pretty much positive over time when they have talked about data trusts. The trouble is that they have not done anything.
Overall, as I say and as many have said in this brief debate, this is a potentially far-reaching and powerful idea with an enormous number of benefits. But the fact that it is far-reaching implies that we need to look at it further. I am afraid that I am not briefed on long-standing—
May I suggest that the Minister writes? On the one hand, he is saying that we will be distorting something—that something is happening out there—but, on the other hand, he is saying that he is not briefed on what is out there or what the intentions are. A letter unpacking all that would be enormously helpful.
I am very happy to write on this. I will just say that I am not briefed on previous government policy towards it, dating back many years before my time in the role.
It was even further. Yes, I am very happy to write on that. For the reasons I have set out, I am not able to accept these amendments for now. I therefore hope that the noble Baroness will withdraw her amendment.
Data Protection and Digital Information Bill Debate
(9 months ago)
Grand Committee
My Lords, I rise to speak to my Amendment 11, to Amendments 14, 16, 17 and 18, and to the Clause 5 and Clause 7 stand part notices. I will attempt to be as brief as I can, but Clause 5 involves rather a large number of issues.
Processing personal data is currently lawful only if it rests on at least one lawful basis, one of which is that the processing is necessary for the legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights and freedoms of the data subject. As such, if a data controller relies on legitimate interests as the legal basis for processing data, they must conduct a balancing test between their interests and those of the data subject.
Clause 5 amends the UK GDPR’s legitimate interest provisions by introducing the concept of recognised legitimate interest, which allows data to be processed without a legitimate interest balancing test. This provides businesses and other organisations with a broader scope of justification for data processing. Clause 5 would amend Article 6 of the UK GDPR to equip the Secretary of State with a power to determine these new recognised legitimate interests. Under the proposed amendment, the Secretary of State must have regard to,
“among other things … the interests and fundamental rights and freedoms of data subjects”.
The usual legitimate interest test is much stronger: the data subject's interests are not merely a matter to have regard to; if they override those of the data controller, a legitimate interest basis cannot lawfully apply.
Annexe 1, as inserted by the Bill, provides a list of exemptions but is overly broad and vague. It includes national security, public security and defence, and emergencies and crime as legitimate interests for data processing without an assessment. The Conservative MP Marcus Fysh said on Third Reading:
“Before companies share data or use data, they should have to think about what the balance is between a legitimate interest and the data rights, privacy rights and all the other rights that people may have in relation to their data. We do not want to give them a loophole or a way out of having to think about that.” —[Official Report, Commons, 29/11/23; col. 896.]
I entirely agree with that.
The amendment in Clause 5 also provides examples of processing that may be considered legitimate interests under the existing legitimate interest purpose, under Article 6(1)(f), rather than under the new recognised legitimate interest purpose. These include direct marketing, intra-group transmission of personal data for internal administrative purposes, and processing necessary to ensure the security of a network.
The Bill also creates a much more litigious data environment. Currently, an organisation's assessment of its lawful purposes for processing data can be challenged through correspondence or an ICO complaint, whereas, under the proposed system, an individual may be forced to legally challenge a statutory instrument in order to contest the basis on which their data is processed.
As I will explain later, our preference is that the clause not stand part, but I accept that there are some areas that need clarification, and Amendment 11 is designed to provide it. The UK GDPR sets out conditions in which processing of data is lawful. The Bill inserts in Article 6(1) a provision specifying that processing shall be lawful for the purposes of a recognised legitimate interest, as I referred to earlier, an example of which may be for the purposes of direct marketing.
Many companies obtain data from the open electoral register. The register is maintained by local authorities, which have the right to sell this data to businesses. Amendment 11 would insert new Article 6(1)(aa) and (ab), which provide that data processing shall be lawful where individuals have consented for their data
“to enter the public domain via a public body”,
or where processing is carried out by public bodies pursuant to their duties and rights, which may include making such data available to the public. Individuals are free to opt out of the open electoral register if they so wish, and it would be disproportionate—in fact, irritating—to notify consumers who have consented to their data being processed that their data is being processed.
On Amendment 14, as mentioned, the Bill would give the Secretary of State the power to determine recognised legitimate interests through secondary legislation, which is subject to minimal levels of parliamentary scrutiny. Although the affirmative procedure is required, this does not entail much scrutiny or much of a debate. The last time MPs did not approve a statutory instrument under the affirmative procedure was in 1978. In practice, interests could be added to this list at any time and for any reason, facilitating the flow and use of personal data for limitless potential purposes. Businesses could be obliged to share the public's personal data with government or law enforcement agencies beyond what they are currently required to do, all based on the Secretary of State's inclination at the time.
We are concerned that this Henry VIII power is unjustified and undermines the very purpose of data protection legislation, which is to protect the privacy of individuals in a democratic data environment, as it vests undue power over personal data rights in the Executive. This amendment is designed to prevent the Secretary of State from having the ability to pre-authorise data processing outside the usual legally defined route. It is important to avoid a two-tier data protection framework in which the Secretary of State can decide that certain processing is effectively above the law.
On Amendment 17, some of the most common settings in which data protection law is broken involve the sharing of the HIV status of a person living with HIV—in their personal life, in employment, in healthcare services and by the police. The sharing of an individual's HIV status can lead to further discrimination against people living with HIV and can increase their risk of harassment or even violence. The National AIDS Trust is concerned that the Bill as drafted does not go far enough to prevent individuals' HIV status being shared with others without their consent. The trust and we believe that the Bill must clarify what an “administrative purpose” is for organisations processing employees’ personal data. Amendment 17 would add wording to clarify that, in paragraph 9(b) of Article 6,
“intra-group transmission of personal data”
in the workplace, within an organisation or in a group of organisations should be permitted only for individuals who need to access an employee’s personal data as part of their work.
As far as Amendment 18 is concerned, as it stands Clause 5 gives an advantage to large undertakings with numerous companies that can transmit data intra-group purely because they are affiliated to one central body. However, this contradicts the repeated position of both the ICO and the CMA that the distinction between first party and third party is not a meaningful measure of privacy risk: what matters is what data is processed, not the corporate ownership of the systems doing the processing. The amendment reflects the organisational measures that undertakings should have in place as safeguards: groups of undertakings transmitting data should be required to have organisational measures, established by contract, in order to take advantage of this transmission of data.
Then we come to the question of Clause 5 standing part of the Bill. This clause is unnecessary and creates risks. It is unnecessary because the legitimate interest balancing test is, in fact, flexible and practical; it already allows processing for emergencies, safeguarding and so on. It is risky because creating lists of specified legitimate interests inevitably narrows this concept and may make controllers less certain about whether a legitimate interest that is not a recognised legitimate interest can be characterised as such. In the age of AI, where change is exponential, we need principles-based and outcome-based legislation that is flexible and can be supplemented with guidance from an independent regulator, rather than a system that requires the Government to legislate more and faster in order to catch up.
There is also a risk that the drafting of this provision does not dispense with the need to conduct a legitimate interest balancing test, because all the recognised legitimate interests contain a necessity test. Established case law interprets the concept of necessity under data protection law as requiring a human rights balancing test to be carried out. This rather points to the smoke-and-mirrors effect of the drafting, which does nothing to improve legal certainty for organisations or protections for individuals.
I now come to Clause 7 standing part. This clause creates a presumption that processing will always be in the public interest or substantial public interest if done in reliance on a condition listed in proposed new Schedule A1 to the Data Protection Act 2018. The schedule will list international treaties that have been ratified by the UK. At present, the Bill lists only the UK-US data-sharing agreement as constituting relevant international law. Clause 7 seeks to remove the requirement for a controller to consider whether the legal basis on which they rely is in the public interest or substantial public interest, has appropriate safeguards and respects data subjects’ fundamental rights and freedoms. But the conditions in proposed new Schedule A1 in respect of the UK-US agreement also state that the processing must be necessary, as assessed by the controller, to respond to a request made under the agreement.
It is likely that a court would interpret “necessity” in the light of the ECHR. The court may therefore consider that the inclusion of a necessity test means that a controller would have to consider whether the UK-US agreement, or any other treaty added to the schedule, is proportionate to a legitimate aim pursued. Not only is it unreasonable to expect a controller to do such an assessment; it is also highly unusual. International treaties are drafted on a state-to-state basis and not in a way that necessarily corresponds clearly with domestic law. Further, domestic courts would normally consider the rights under the domestic law implementing a treaty, rather than having to interpret an international instrument without reference to a domestic implementing scheme. Being required to do so may make it more difficult for courts to enforce data subjects’ rights.
The Government have not really explained why it is necessary to amend the law in this way rather than simply implementing the UK-US agreement domestically. That would be the normal approach; it would remove the need for this new legal basis and enable controllers to use the existing framework to identify a legal basis in domestic law for processing data. Instead, this clause makes it more difficult to understand how the law operates, which could in turn deter data sharing in important situations. Perhaps the Minister could explain why Clause 7 is there.
I beg to move.
My Lords, I rise to speak to Amendments 13 and 15. Before I do, let me say that I strongly support the comments of the noble Lord, Lord Clement-Jones, about HIV and the related vulnerability, and his assertion—almost—that Clause 5 is a solution in search of a problem. “Legitimate interest” is a flexible concept and I am somewhat bewildered as to why the Government are seeking to create change where none is needed. In this context, it follows that, were the noble Lord successful in his argument that Clause 5 should not stand part, Amendments 13 and 15 would be unnecessary.
On the first day in Committee, we debated a smaller group of amendments that sought to establish the principle that nothing in the Bill should lessen the privacy protections of children. In his response, the Minister said:
“if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data”.—[Official Report, 20/3/24; col. GC 75.]
I am glad the Minister is open to listening and that the Government’s intention is to protect children, but, as discussed previously, widening the definition of “research” in Clause 3 and watering down purpose limitation protections in Clause 6 negatively impacts children’s data rights. Again, in Clause 5, lowering the protections for all data subjects has consequences for children.
The balancing test remains in place for legitimate interests under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts into Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but this does not mean that organisations will necessarily be able to process personal data for those purposes. Organisations will need to assess on a case-by-case basis where the balance of interests lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). It is focused on processing activities that are currently listed in the recitals to the EU GDPR but these are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to share data about serious crimes with law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022, and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public interest processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.
I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.
My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way people’s data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government
“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]
Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.
Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is: should we be soliciting support from 14 to 17-year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on a “soft” opt-in to process data and make other changes without scrutiny.
Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.
All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.
My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that commencement does not benefit any party over another.
On the point made about the appropriate age for the provisions, the age of 14 reflects the variations in voting age across the UK: in some parts of the country, such as Scotland, the voting age is 16 for some elections, and a person can join the electoral register at 14 as an attainer. An attainer is someone who is registered to vote in advance of being able to do so, so that they are on the electoral roll as soon as they turn the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.
My Lords, this whole area of democratic engagement is one that the Minister will need to explain in some detail. This is an Alice in Wonderland schedule: “These words mean what I want them to mean”. If, for instance, you are engaging with the children of a voter—at 14, they are children—is that democratic engagement? You could drive a coach and horses through Schedule 1. The Minister used the word “necessary”, but he must give us rather more than that. It was not very reassuring.
The Minister mentioned a presumption that the ICO will update its guidance. Is there a timeframe for that? Will the guidance be updated before this comes into effect? How does the age of 14 relate to the AADC, which sets the age of adulthood at 18?
My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:
“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.
I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.
Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.
I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability under future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom they are accountable.
My Lords, I must congratulate the noble Lord, Lord Kamall. Amid a blizzard of technical and minor amendments from the Minister, he forensically spotted one to raise in that way. He is absolutely right. The Industry and Regulators Committee has certainly been examining the accountability and scrutiny devoted to regulators, so we need to be careful in the language that we use. I think we have to take a lot on trust from the Minister, particularly in Grand Committee.
I apparently failed to declare an interest at Second Reading. I forgot to state that I am a consultant to DLA Piper and the Whips have reminded me today that I failed to do so on the first day in Committee, so I apologise to the Committee for that. I am not quite sure why my consultancy with DLA Piper is relevant to the data protection Bill, but there it is. I declare it.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate whether it should stand part because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.
A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent establishing that a controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.
There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.
Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as
“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.
The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.
Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.
In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that
“The new parameters are not intended to be reasons for refusal”,
but rather to give
“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]
But Dr Jeni Tennison of Connected by Data pointed out in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the predicted savings would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.
We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a
“reasonable and proportionate search for the personal data and other information”.
This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks, and this is a continuing theme of data adequacy.
Clause 12 does not provide a definition of reasonable and proportionate searches but, when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate
“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
Those considerations diverge from those in the Information Commissioner’s guidance on the right of access, which states that, when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.
We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.
We always allow for belt and braces, and if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for the refusal and tell the data subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see Clauses 9 and 12 deleted from the Bill.
As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 set out the new threshold for refusing, or charging a reasonable fee for, a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any person, not just data subjects, providing necessary certainty.
Looking at the wording of the Members’ explanatory statements for wishing to leave out Clauses 9 and 36, I do not think that the Minister has addressed this point: does he accept that the Bill now provides a more lax approach? Is this a reduction of the standard expected? To me, “vexatious or excessive” sounds very different from “manifestly unfounded or excessive”. Does he accept that basic premise? That is really the core of the debate; if not, we have to look again at the issue of resources, which seems to be the argument for making this change.
If that is the case and this is a dilution, is this where the Government think they will get the savings identified in the impact assessment? It was alleged in the Public Bill Committee that this is where a lot of the savings would come from—we all have rather different views. My first information was that every SME might save about £80 a year; then, suddenly, the Secretary of State started talking about £10 billion of benefit from the Bill. Clarification of that would be extremely helpful. There seems to be a dichotomy between the noble Lord, Lord Bassam, saying that this is a way to reduce the burdens on business, and the Minister saying that it is all about confident refusal and confidence. He has used that word twice, which is worrying.
I apologise for intervening, but the Minister referred to resources. By that, he means the resources of the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for refusing to provide information will be much lower than it is under the current legislation. Can the Minister say something about how controllers will take social cost into account, or how the Government have taken it into account?
The actual application of the terms will be set out in guidance by the ICO, but the intention is to filter out the more disruptive and cynical requests. Designing these words is never easy, but there has been considerable consultation on this in order to achieve that intention.
My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.
I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.
It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.
My Lords, can we join in with the request to see that information in a letter? We would like to see where these savings will be made and how much will, as noble Lords have said, be affected by the clauses that we are debating today.
The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.
I am not quite sure what is being requested because the impact assessment has been not only made but published.
I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.
That raises the question of where on earth the rest is coming from.
Which I will be delighted to answer. Amid this interesting exchange, I have lost track of the specific questions that the noble Lord, Lord Sikka, asked, but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.
I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.
Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.
On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—
My Lords, I support Amendments 27 to 34, tabled variously by my noble friend Lady Harding, and the noble Lord, Lord Clement-Jones, to which I have added my name. As this is the first time I have spoken in Committee, I declare my interests as deputy chairman of the Telegraph Media Group and president of the Institute of Promotional Marketing and note my other declarations in the register.
The direct marketing industry is right at the heart of the data-driven economy, which is crucial not just to the future of the media and communications industries but to the whole basis of the creative economy, which will power economic growth into the future. The industry has quite rightly welcomed the Bill, which provides a long-term framework for economic growth as well as protecting customers.
However, there is one area of great significance, as my noble friend Lady Harding has just eloquently set out, on which this Bill needs to provide clarity and certainty going forward, namely, the use of the open electoral register. That register is an essential resource for a huge number of businesses and brands, as well as many public services, as they try to build new audiences. As we have heard, it is now in doubt because of a recent legal ruling that could, as my noble friend said, lead to people being bombarded with letters telling them that their data on the OER has been used. That is wholly disproportionate and is not in the interests of the marketing and communications industry or customers.
These sensible amendments would simply confirm the status quo that has worked well for so long. They address the issue by providing legal certainty around the use of the OER. I believe they do so in a proportionate manner that does not in any way compromise any aspect of the data privacy of UK citizens. I urge the Minister carefully to consider these amendments. As my noble friend said, there are considerable consequences of not acting for the creative economy, jobs in direct marketing, consumers, the environment and small businesses.
My Lords, I am extremely grateful to the noble Baroness, Lady Harding, and the noble Lord, Lord Black, for doing all the heavy lifting on these amendments. I of course support them, having put forward my own amendments; it is just the luck of the draw that the noble Baroness, Lady Harding, put forward her amendment along with all the others. I have very little to say in this case, and just echo what the noble Lord, Lord Black, said about the fact that the open electoral register has played an important part in the direct marketing, data-driven economy, as it is described. It is particularly interesting that he mentioned the creative industries as well.
The First-tier Tribunal precedent could have an impact on other public sources of data, including the register of companies, the register of judgments, orders and fines, the land register and the Food Standards Agency register. It could have quite far-reaching implications unless we manage to resolve the issue. There is a very tight timescale: the First-tier Tribunal’s ruling means that companies must notify those on the electoral register by 20 May or be at risk of breaching the law. This is really the best route for trying to resolve the issue. Secondly, the First-tier Tribunal’s ruling states that costs cannot be considered as disproportionate effort, which is why these amendments explicitly refer to that. This is no trivial matter. It is a serious area that needs curing by this Bill, which is a good opportunity to do so.
I shall speak briefly to the question of whether Clause 11 should stand part. That may seem a bit paradoxical, but it is designed to address issues arising under Article 13, not Article 14. Article 13 of the UK GDPR requires controllers who intend to process, for a new purpose, data that was collected directly from data subjects—as opposed to Article 14, whose obligations apply to personal data not obtained from the data subject—to inform data subjects of various matters to the extent necessary,
“to ensure fair and transparent processing”.
Clause 11(1) removes this obligation for certain purposes where it would require disproportionate effort. The obligation is already qualified to what is necessary to make processing fair and transparent, which are the fundamental requirements of the GDPR. If, in these circumstances, processing cannot be made fair and transparent without disproportionate effort, it should not take place. Clause 11(1) would sidestep the requirement and allow unfair, non-transparent processing to go ahead for personal data that the data controllers had themselves collected. Perhaps I should have tabled a rather more targeted amendment, but I hope that noble Lords see the point of the distinction between Article 13 and Article 14 here.
On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.
My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.
Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.
On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.
Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.
The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency; even where data controllers rely on this exemption, they should still consider other ways to make the processing they undertake as fair and transparent as possible.
Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.
My Lords, the number of amendments proposed to Clause 14 reflects the Committee’s very real concern about the impact of automated decision-making on the privacy, safety and prospects of UK data subjects. I have specific amendments in groups 7 and 8, so I will speak to the impact of Clause 14 on children later. I will again be making arguments about the vulnerability of these systems in relation to the Government’s proposals on the DWP.
Without repeating the arguments made, I associate myself with most of the proposals and the intention behind them—the need to safeguard the prospects of a fair outcome when algorithms hold sway over a person’s future. It seems entirely logical that, if the definition of solely automated decision-making requires “no meaningful human involvement”, we should be clear, as Amendment 40 proposes, about what is considered “meaningful”, so that the system cannot be gamed by human involvement that provides an ineffective safeguard and is therefore not meaningful.
I have sympathy with many of these amendments—Amendments 38A, 39, 47, 62, 64 and 109—and ultimately believe, as was suggested by the noble Lord, Lord Bassam, that it is a matter of trust. I refer briefly to the parliamentary briefing from the BMA, which boldly says:
“Clause 14 risks eroding trust in AI”.
That would be a very sad outcome.
My Lords, we have heard some powerful concerns on this group already. This clause is in one of the most significant parts of the Bill for the future. The Government’s AI policy is of long standing. They started it many years ago, then had a National AI Strategy in 2021, followed by a road map, a White Paper and a consultation response to the White Paper. Yet this part of the Bill, which is overtly about artificial intelligence and automated decision-making, does not seem to be woven into their thinking at all.
As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.
The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be implemented responsibly and that any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the rights to request a review of a decision and to have that review carried out by a human. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.
The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Recording Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.
My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the algorithmic recording standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?
That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.
The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.
I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.
Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.
Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.
I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in, and to be embedded on an agile basis, in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.
The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The present system—if I may characterise it in a slightly blunt way—provides quite a lot of uncertainty, so that people do not take the decision to embrace the technology positively and safely. By bringing in this clarity, we will see an increase not only in the safety of these applications but in their use, driving up productivity in both the public and private sectors.
My Lords, the amendments in this group highlight that Clause 14 lacks the necessary checks and balances to uphold equality legislation, individual rights and freedoms, data protection rights, access to services, fairness in the exercise of public functions and workers’ rights. I add my voice to that of the noble Lord, Lord Clement-Jones, in his attempt to make Clause 14 not stand part, which he will speak to in the next group.
I note, as the noble Lord, Lord Bassam, has, that all the current frameworks have fundamental rights at their heart, whether it is the White House blueprint, the UN Secretary-General’s advisory body on AI, with which I am currently involved, or the EU’s AI Act. I am concerned that the UK does not want to work within this consensus.
With that in mind, I particularly note the importance of Amendment 41. As the noble Lord said, we are all supposed to adhere to the Equality Act 2010. I support Amendments 48 and 49, which are virtually interchangeable in wanting to ensure that the test of a decision being “solely” based on automated processing cannot be gamed by adding a trivial human element to avoid that designation.
Again, I suggest that the Government cannot have it both ways—with nothing diminished but everything liberated and changed—so I find myself in agreement with Amendment 52A and Amendment 59A, which is in the next group, from the noble Lord, Lord Holmes, who is not in his place. These seek clarity from the Information Commissioner.
I turn to my Amendment 46. My sole concern is to minimise the impact of Clause 14 on children’s safety, privacy and life chances. The amendment provides that a significant decision about a data subject must not be based solely on automated processing if
“the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child”,
taking into account the full gamut of their rights and development stage. Children have enhanced rights under the UNCRC, to which the UK is a signatory. Due to their evolving capacities as they make the journey from infancy to adulthood, they need special protections. If their rights are diminished in the digital world, their rights are diminished full stop. Algorithms determine almost every aspect of a child’s digital experience, from the videos they watch to their social network, and from the sums they are asked to do in their maths homework to the team they are assigned when gaming. We have seen young boys wrongly profiled as criminals and girls wrongly associated with gangs.
In a later group, I will speak to a proposal for a code of practice on children and AI, which would codify standards and expectations for the use of AI in all aspects of children’s lives, but for now, I hope the Minister will see that, without these amendments to automated decision-making, children’s data protection will be clearly weakened. I hope he will agree to act to make good his earlier assertion that nothing in the Bill will undermine child protection. The Minister is the Minister for AI. He knows the impact this will have. I understand that, right now, he will probably stick to the brief, but I ask him to go away, consider this from the perspective of children and parents, and ask, “Is it okay for children’s life chances to be automated in this fashion?”
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.
Let me take first the amendments that deal with restrictions on and safeguards for ADM and the degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one particular piece of legislation—in this case, the Equality Act 2010—is prohibited, while not referring to other legislation whose contravention is equally prohibited, would lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.
Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Articles 6 and 9 lawful bases, they would also have to demonstrate that the processing was necessary for substantial public interest.
Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.
It may be either the controller or the processor but, for any legal or similarly significant decision, there is a requirement right now—today, before the Bill comes into effect. That requirement is retained by the Bill.
In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.
Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology and continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.
Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.
Amendments 98A and 104A relate to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.
I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?
I would not say it is not right, but if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be adopted as a means for all private organisations using ADM and AI to meet the transparency principles that they are required to adopt.
So would the Minister not be averse to it? It would merely put the public sector ahead of the game, allowing it to show the way, and then there might be a little regulation for the private sector.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.
Data Protection and Digital Information Bill Debate
(9 months ago)
Grand Committee
My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.
The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.
The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement”, even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision had a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to reoccur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.
New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that they may add safeguards but cannot vary or remove those in the new Article 22D, as they stand, when the legislation comes into force.
If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.
As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State must consult the Information Commissioner’s Office when developing regulations. It also includes an obligation for the Secretary of State to consult data subjects or their representatives, such as trade unions or civil society organisations, at least every two years from the commencement of the Bill.
Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal data, though not special category data: it is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:
“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]
That is very much my feeling about the clause as well.
I refer back to the impact assessment, which we touched on during our debate on Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put, rather optimistically, at £295 million.
In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I do not square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe that the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.
Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made: neither the Bill nor any of the material associated with it indicates what the content of the information to be provided is expected to be, nor the timescales in which it is to be given. There is nothing to say when representations or contest may be heard, when human intervention may be sought or what the level of that intervention should be. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.
Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.
There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.
My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the work lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has deep expertise in data law, this year undertook a video-based interview for a Russell Group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after the Government’s changes to Article 22, particularly as, in conjunction with other changes such as those to subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency about what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when they introduced surveillance on to his computer during his working from home. At the time, he said that the way in which it impacted on both his self-esteem and autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.
I reject the characterisation of Clause 14, or any part of the Bill, as loosening the safeguards. It focuses on the outcomes and, by being less prescriptive and more adaptive, its goal is to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
My Lords, I think we should try to let the Minister make a little progress and see whether some of these questions are answered.
I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when deemed necessary and appropriate in the light of fast-moving advances in and adoption of technologies relevant to automated decision-making. I reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional safeguards and remove any safeguards added at a later date. They cannot remove any of the safeguards written into the legislation.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
My Lords, I feel less reassured after this debate than I did even at the end of our two groups on Monday. I thank all those who spoke in this debate. There is quite a large number of amendments in this group, but a lot of them go in the same direction. I was very taken by what the noble Baroness, Lady Kidron, said: if the Government are offering safeguards and not rights, that is really extremely worrying. I also very much take on board what the noble Baroness, Lady Harding, had to say. Yes, of course we are in favour of automated decision-making, as it will make a big difference to our public services and quite a lot of private businesses, but we have to create the right ground rules around it. That is what we are talking about. We all very much share the question of children having a higher bar. The noble Baroness, Lady Jones, outlined exactly why the Secretary of State’s powers either should not be there or should not be expressed in the way that they are. I very much hope that the Minister will write on that subject.
More broadly, there are huge issues here. I think that it was the noble Baroness, Lady Kidron, who first raised the fact that the Government seem to be regulating in a specific area relating to AI that is reducing rights. The Minister talks about now regulating outcomes, not process. As the noble Baroness, Lady Jones, said, we do not have any criteria—what KPIs are involved? The process is important—the ethics by which decisions are made and the transparency involved. I cannot see that it is simply about whether the outcome is such and such; it is about the way in which people make decisions. I know that people like talking about outcome-based regulation, but it is certainly not the only important aspect of regulation.
On the issue of removing prescriptiveness, I am in favour of ethical prescriptiveness, so I cannot see that the Minister has made a particularly good case for the changes made under Clause 14. He talked about having access to safeguards when they matter most. It would be far preferable to have rights that can be exercised in the face of automated decision-making, in particular workplace protection. At various points during the debates on the Bill we have touched on things such as algorithmic impact assessment in the workplace, and no doubt we will return to it. That is of great and growing importance, but again there is no recognition of it here.
I am afraid that the Minister has not made a fantastic case for keeping Clause 14 and I think that most of us will want to kick the tyres and carry on interrogating whether it should be part of the Bill. In the meantime, I beg leave to withdraw Amendment 53.
My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the Algorithmic Transparency Recording Standard in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations provide clear information about the algorithmic tools that they use, how they operate and why they are using them.
The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.
We welcome the recent commitments made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that there is an opportunity to deliver on this commitment through the DPDI Bill by placing it on a statutory footing, rather than leaving it as a requirement in guidance. That is what Amendment 74 is designed to do.
We also propose another new clause to reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would, for as long as the Secretary of State considers it inappropriate to make the ATRS compulsory, also require the Secretary of State regularly to explain why and to keep that decision under continual review.
Amendment 76 on safe and responsible automated decision systems proposes a new clause that seeks to shift the burden back on public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and putting the burden on individuals to challenge it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that
“automated decision systems … are responsible and minimise harm to individuals and society at large”.
The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are
“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.
This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.
It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. We had a small debate about the ATRS and whether a compliance system was in place. It would be useful to see whether the Minister has any further comment on that, but I think that he disagreed with my characterisation that there is no compliance system currently.
This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.
Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.
I feel under amazing pressure to get the names right, especially given the number of hours we spend together.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.
Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.
The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.
In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.
More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.
For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.
My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted ATRS, would that, in practice, give citizens a greater degree of what he might put as safeguards but, in this context, he is describing as rights?
I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand what rights they have and how data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.
I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.
As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.
On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
It is already explicit that:
“Activities addressed specifically to children shall receive specific attention”.
That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.
Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
My Lords, I thank the Minister for his response. There are a number of different elements to this group.
The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.
My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.
I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial. There are standards from organisations such as the IEEE for that kind of ethics by design in AI systems. I believe that it is possible to do exactly what she suggested, and we should incorporate that into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical approach here, which highlights the differences between how some of us and the Government are approaching these things. How you do something, the way you design it and the fact that it needs to be ethical are absolutely cardinal in any discussion—particularly about artificial intelligence. I do not think that it is good enough simply to talk about the results of what AI does without examining how it does it.
Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.
The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.
The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.
On the trade issue, the Government argue that, by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, the costs are significantly lower than the inflated figures cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.
There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.
Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.
The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.
I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause masquerades as an attempt to get rid of red tape. In reality, it makes organisations less likely to comply with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations themselves through non-compliance. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirement of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.
At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.
Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:
“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.
However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. The Government frequently pray in aid the consultation—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it here? That seems an extraordinary thing to do when impact assessments are regarded as a useful tool and businesses have clearly adjusted to them in the years since the Data Protection Act 2018.
My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak both to the child amendments to which I have put my name and to the non-child ones, and to raise things up a level.
The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running consumer and citizen digitally enabled services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.
One of the bedrocks of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do and what the risks and opportunities of that technology are. My experience as a non-engineer is that, when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it from the outset with ethics and safety at its core, it is too late by the time you see the impact in the real world.
I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.
The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.
However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.
One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.
The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.
I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription, and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.
My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.
I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.
Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.
Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.
However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.
I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.
By offering firms a choice on whether to appoint a UK representative to help them with UK GDPR compliance, and by no longer mandating organisations to appoint one, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the packages made available by representative provider organisations vary, our assessments show that the cost of appointing a representative increases with the size of the firm. Furthermore, several jurisdictions do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.
Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?
No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.
Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.
I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.
The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.
Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.
I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.
The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.
Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.
The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.
The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.
Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?
Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.
Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the regulator’s hands and force it to give binding advice and proactive assurance without full knowledge of the facts, undermining its regulatory enforcement role.
As long as that applies to us on occasion as well.
My Lords, just in passing, I will say that the decision made by the Privileges Committee, and now the House, is beginning to creak in the face of the very first Grand Committee it has encountered. In terms of time limits, flexibility in Grand Committee in particular is absolutely crucial. I am afraid that the current procedures will not necessarily stand the test of time—but we shall see.
This is a relatively short debate on whether Clause 19 should stand part, but it is a really significant clause, and it is another non-trust-engendering provision. It basically takes away the duty on the police to record their justification for consulting or sharing personal data. Prompted by the National AIDS Trust, we believe that the Bill must retain the duty on police forces to justify why they have accessed an individual’s personal data.
This clause removes an important check on police processing of an individual’s personal data. The NAT has been involved in cases of people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and within the wider communities that they serve. The requirement for police officers to record why they have accessed an individual’s personal data therefore provides vital evidence in cases of police misconduct, including when a person’s HIV status is shared inappropriately by the police or when it is not relevant to an investigation of criminal activity.
The noble Baroness, Lady Kidron, was extremely eloquent in her winding up of the last group. The Minister really needs to come back and tell us what on earth the motivation is behind this particular Clause 19. I beg to move that this clause should not stand part of the Bill.
This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.
The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.
In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data, because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is a lawful reason for the access, ensuring that accountability and protection for data subjects are maintained.
Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.
I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.
This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.
The fact is that the recorded justification, even where it is false and cannot be relied upon as evidence of the truth, is rather useful, because it is itself evidence of police misconduct in relation to inappropriately accessing personal data: the officers are actually saying, “We did it for this purpose”, when it clearly was not done for that purpose. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail; it references them just once, in relation to automated systems that process data.
I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.
My Lords, given the hour, I will be brief. That was an absolute tour de force by the noble Baroness. As I do with the Minister’s speeches, I will read hers over Easter.
I was very interested to be reminded of the history of Napster, because that was when many of us realised that we were, in many ways, entering the digital age in the creative industries and beyond. The amendments that the noble Baroness has put forward are examples of where the Bill could make a positive impact, unlike so much of the rest of it, which waters down rights. She described cogently how large language models are ingesting or scraping data from the internet, social media and journalism, how close this whole agenda comes to the ingestion of copyright material and how it is being done by anonymous bots in particular. It fits very well with the debate in which the Minister was involved last Friday on the Private Member’s Bill of the noble Lord, Lord Holmes, which includes a clause requiring transparency about the ingestion or scraping of data and copyright material by large language models.
The opportunity in the data area is currently much greater than in the intellectual property area. At least we have the ICO, which is a regulator, unlike the IPO, which is not really a regulator with teeth. I am very interested in the fact that the ICO is conducting a consultation on generative AI and data protection, which it launched in January. Concurrently with this Bill, perhaps the ICO might come to some conclusions that we can use. That would of course include the whole area of biometrics, which, in the light of things such as deepfakes, is increasingly an issue of great concern. The watchword is “transparency”: we must impose a duty on generative AI developers to be transparent about the material they use to train their models and then use in operation. I fully support Amendments 103 and 104 in the name of the noble Baroness, even though, as she describes them, they are a small step.
My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.
As the noble Baroness said, large language models are built on capturing text, data and images from infinite sources without the permission of the original creator of the material. As she also said, it is making a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, whereby the owner of the information finds out only much later that their content has been repurposed.
What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs are notorious for hallucinations in which good quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into the LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.
It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations, including by enabling individuals to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as about giving guidance.
I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?
In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.
My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.
Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.
I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.
We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.
We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.
In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for retaining the provision, given that processors are limited in what they can do with data because of the nature of their contracts, and given that it would be unlikely to contribute to the effective functioning of the ICO.
Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.
We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, it is not appropriate in the law enforcement to international processor context, because processors cannot decide to transfer data onwards of their own volition; they can do so only under instruction from the UK law enforcement authority controller.
Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to their processors that data is only to be further transferred in line with the terms of the contract with, or an authorisation given by, the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77, which governs transfers to non-relevant authorities, relevant international organisations or international processors.
In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.
I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my commitment to continue engaging with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.
My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.
Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this—the one question in our minds—is: what is the process for making sure that the country we are sending data to is data adequate? Perhaps the Minister could remind us. Amendment 121 was tabled as a way of probing that. This should apply to transfers between law enforcement authorities just as much as it does to other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have it to hand, I am very happy to suspend my curiosity until after Easter.
My Lords, I too can be brief, having heard the Minister’s response. I thought he half-shot the Clement-Jones fox, with very good aim on the Minister’s part.
I was simply going to say that it is one in a sea of amendments from the Government, but the noble Lord, Lord Clement-Jones, made an important point about making sure that the countries and organisations that the commissioner looks at meet the test of data adequacy—I also had that in my speaking note. It is a good point in terms of ensuring that appropriate data protections are in place internationally for us to be able to work with.
The Minister explained the government amendments with some care, but I wonder whether he could explain how data transfers are made to an overseas processor using the powers relied on by reference to new Section 73(4)(aa) of the 2018 Act. The power is used as a condition and justification for several of the noble Viscount’s amendments, and I wonder whether he has had to table these amendments because of the original drafting. That would seem to me to be the most likely reason.
Data Protection and Digital Information Bill Debate
(8 months, 1 week ago)
Grand Committee
Once more unto the breach, my Lords—as opposed to “my friends”.
I will also speak to Amendments 112 to 114, 116 and 130. New Article 45B(2) lists conditions that the Secretary of State must consider when deciding whether a third country provides an adequate level of protection for data subjects. It replaces the existing conditions in Article 45(2)(a) to (c) of the UK GDPR, removing important considerations such as the impact of a third country’s laws and practices in relation to national security, defence, public security, criminal law and public authority access to personal data on the level of protection provided to UK data subjects.
Despite this shorter list of conditions to consider, the Secretary of State is none the less required to be satisfied that a third country provides a level of protection that is not materially lower than the UK’s. It is plain that such an assessment cannot be made without considering the impact of these factors on the level of protection for UK data in a third country. It is therefore unclear why the Government’s amendment to Article 45 is necessary, beyond a desire to draw attention away from such contentious and complicated issues.
It may be that through rewriting Article 45 of the UK GDPR, the Government’s intention is that assimilated case law on international data transfers is no longer relevant. If that is the case, that would be a substantial risk for UK data adequacy. Importantly, new Article 45B(2) removes the reference to the need for an independent data protection regulator in the relevant jurisdiction. This, sadly, is consistent with the theme of diminishing the independence of the ICO, which is one of the major concerns in relation to the Bill, and it is also an area where the European Commission has expressed concern. The independence of the regulator is a key part of the EU data adequacy regime and is explicitly referenced in Article 8 of the Charter of Fundamental Rights, which guarantees the right to protection of personal data. Amendment 111 restores the original considerations that the Secretary of State must take into account.
Amendments 112 and 113 would remove the proposed powers in Schedules 5 and 6 of the Secretary of State to assess other countries’ suitability for international transfers of data, and place these on the new information commission instead. In the specific context of HIV—the provenance of these amendments is in the National AIDS Trust’s suggestions—it is unlikely that the Secretary of State or their departmental officials will have the specialist knowledge to assess whether there is a risk of harm to an individual by transferring data related to their HIV status to a third country. Given that the activities of government departments are political by their nature, the Secretary of State making these decisions related to the suitability of transfer to third countries may not be viewed as objective by individuals whose personal data is transferred. Many people living with HIV feel comfortable reporting breaches of data protection law in relation to their HIV status to the Information Commissioner’s Office due to its position as an independent regulator, so the National AIDS Trust and others recommend that the Bill places these regulatory powers on the new information commission created by the Bill instead, as this may inspire greater public confidence.
As regards Amendment 114, paragraph 5 of Schedule 5 should contain additional provisions to mandate annual review of the data protection test for each third country to which data is transferred internationally to ensure that the data protection regime in that third country is secure and that people’s personal data, such as their HIV status, will not be shared inappropriately. HIV is criminalised in many countries around the world, and the transfer to these countries of personal data such as an individual’s HIV status could put an individual living with HIV, their partner or their family members at real risk of harm. This is because HIV stigma is incredibly pronounced in many countries, which fosters a real risk of HIV-related violence. Amendment 114 would mandate this annual review.
As regards Amendment 116, new Article 47A(4) to (7) gives the Secretary of State a broad regulation-making power to designate new transfer mechanisms for personal data being sent to a third country in the absence of adequacy regulations. Controllers would be able to rely on these new mechanisms, alongside the existing mechanisms in Article 46 of the UK GDPR, to transfer data abroad. In order to designate new mechanisms, which could be based on mechanisms used in other jurisdictions, the Secretary of State must be satisfied that these are
“capable of securing that the data protection test set out in Article 46 is met”.
The Secretary of State must be satisfied that the transfer mechanism is capable of providing a level of protection for data subjects that is not materially lower than under the UK GDPR and the Data Protection Act. The Government have described this new regulation-making power as a way to future-proof the UK’s GDPR international transfers regime, but they have not been able to point to any transfer mechanisms in other countries that might be suitable to be recognised in UK law, and nor have they set out examples of how new transfer mechanisms might be created.
In addition to there being no clear rationale for taking the power, it is not clear how the Secretary of State could be satisfied that a new mechanism is capable of providing the appropriate level of protection for data subjects. This test is meant to be a lower standard than the test for controllers seeking to rely on a transfer mechanism to transfer data overseas, which requires them to consider that the mechanism provides the appropriate level of protection. It is not clear to us how the Secretary of State could be satisfied of a mechanism’s capability without having a clear sense of how it would be used by controllers in reality. That is the reason for Amendment 116.
As regards Amendment 130, Ministers have continued all the adequacy decisions that the EU had made in respect of third countries when the UK stopped being subject to EU treaties. The UK also conferred data adequacy on the EEA, but all this was done on a transitional basis. The Bill now seeks to continue those adequacy decisions, but no analysis appears to have been carried out as to whether these jurisdictions confer an adequate level of protection of personal data. This is not consistent with Section 17B(1) of the DPA 2018, which states that the Secretary of State must carry out a review of whether the relevant country that has been granted data adequacy continues to ensure an adequate level of protection, and that these reviews must be carried out at intervals of not more than four years.
In the EU, litigants have twice brought successful challenges against adequacy decisions; those decisions were deemed unlawful and quashed by the European Court of Justice. It appears that this sort of challenge would not be possible in the UK, because the adequacy decisions are being continued by the Bill and therefore through primary legislation. Any challenge to these adequacy decisions could result only in a declaration of incompatibility under the Human Rights Act; the decisions could not be quashed by the UK courts. This is another example of how leaving the EU has diminished the rights of UK citizens compared with their EU counterparts.
As well as tabling those amendments, I support and have signed Amendment 115 in the names of the noble Lords, Lord Bethell and Lord Kirkhope, and I look forward to hearing their arguments in relation to it. In the meantime, I beg to move.
My Lords, I rise with some temerity. This is my first visit to this Committee to speak. I have popped in before and have been following it very carefully. The work going on here is enormously important.
I am speaking to Amendment 115, thanks to the indulgence of my noble friend Lord Bethell, who is the lead name on that amendment but has kindly suggested that I start the discussions. I also thank the noble Lord, Lord Clement-Jones, for his support. Amendment 115 has one clear objective and that is to prevent transfer of UK user data to jurisdictions where data rights cannot be enforced and there is no credible right of redress. The word “credible” is important in this amendment.
I thank my noble friend the Minister for his letter of 11 April, which he sent to us to try to mop up a number of issues. In particular, in one paragraph he referred to the question of adequacy, which may also touch on what the noble Lord, Lord Clement-Jones, has just said. The Secretary of State’s powers are also referred to, but I must ask: how, in a fast-moving or unique situation, can all the factors referred to in this long and comprehensive paragraph be considered?
The mechanisms of government and government departments must be thorough and in place to satisfactorily discharge what are, I think, somewhat grand intentions. I say that from a personal point of view, because I was one of those who drafted the European GDPR—another reason I am interested in discussing these matters today—and I was responsible for the adequacy decisions with third countries. The word “adequacy” matters very much in this group, in the same way that we were unable to use “adequacy” when we dealt with the United States and had to look at “equivalence”. Adequacy can work only if one is working to similar parameters. If one is constitutionally working to different parameters, as is the case in the United States, then “equivalence” becomes much more relevant: although administration or regulation cannot be carried out in quite the same way, an equivalence arrangement can be acceptable and can deliver the level of protection that adequacy is meant to secure.
I have a marvellous note here, which I am sure noble Lords have already talked about. It says that every day we generate 181 zettabytes of personal data. I am sure noble Lords are all aware of zettabytes, but I will clarify: one zettabyte is 1,000 exabytes—which perhaps makes it simpler to understand—or, if you like, 1 billion trillion bytes. One’s mind just has to get around this, but this is data on our movements, finances, health and families, from our cameras, phones, doorbells and, I am afraid, even our refrigerators—though Lady Kirkhope refuses point blank to have any kind of detector on her fridge door that will tell anybody anything about us or what we eat. Increasingly, it is also data from our cars. Our every moment is recorded—information relating to everything from shopping preferences to personal fitness, even to our anxieties as they are displayed or discussed. It is stored by companies that we entrust with that data, and we have a right to expect that such sensitive and private data will be protected. Indeed, one of the core principles of data protection, as we all know, is accountability.
Article 79 of the UK GDPR and Section 167 of our Data Protection Act 2018 provide that UK users must have the right to effective judicial remedy in the event of a data protection breach. Article 79 says that
“each data subject shall have the right to an effective judicial remedy where he or she considers that his or her rights under this Regulation have been infringed as a result of the processing of his or her personal data in non-compliance with this Regulation”.
I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.
I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.
Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments if they choose to do so; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.
I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.
Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.
We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people’s HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said to this Committee previously, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.
Can the Minister just recap? He said that he met the trust and then swiftly moved on without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?
The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.
The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?
I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.
I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.
I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.
On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.
Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.
My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.
My Lords, in a way the Minister is acknowledging that a watering down is taking place, yet the Government seem fairly relaxed about these issues. If something happens, the Government will do something or other, or the commissioner will. But the Government are proposing to water down Article 45, and that is the essence of what we are all talking about here. We are not satisfied with the current position, and watering down Article 45 will make it even worse; there will be more Yandexes.
As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts the enforcement of this itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements on the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.
Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.
I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.
Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.
For these reasons, I hope noble Lords will not press their amendments.
My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.
By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.
The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?
The Minister resisted describing this as watering down. Why get rid of the list of considerations that the Secretary of State needs to have regard to, so that they are just in the mix as something that may or may not be taken into consideration? In the existing article they are specified. It is quite a long list and the Government have chopped it back. What is the motive for that? It looks like data subjects’ rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of the real importance of this—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.
Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether the countries to which data is being transferred are adequate must be decided on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.
My Lords, I support Amendment 135 in the name of the noble Lord, Lord Bethell, to which I have added my name. He set out our struggle during the passage of the Online Safety Bill, when we made several attempts to get something along these lines into the Bill. It is worth actually quoting the Minister, Paul Scully, who said at the Dispatch Box in the other place:
“we have made a commitment to explore this … further and report back to the House in due course on whether further measures to support researcher access to data are required and, if so, whether they could also be implemented through other legislation such as the Data Protection and Digital Information Bill”.—[Official Report, Commons, 12/9/23; col. 806.]
When the Minister responds, perhaps he could update the House on that commitment and explain why the Government decided not to address it in the Bill. Although the Bill proposes a lessening of the protections on the use of personal data for research done by commercial companies, including the development of products and marketing, it does nothing to enable public interest research.
I would like to add to the list that the noble Lord, Lord Bethell, started: as well as Melanie Dawes, the CEO of Ofcom, the United States National Academy of Sciences, the Lancet commission, the UN advisory body on AI, the US Surgeon General, the Broadband Commission and the Australian eSafety Commissioner have all in the last few months called for greater access to data for independent research.
I ask the noble Viscount to explain the Government’s thinking in detail, and I really do hope that we do not get more “wait and see”, because it does not meet the need. We have already passed online safety legislation that requires evidence, and by denying access to independent researchers, we have a perverse situation in which the regulator has to turn to the companies it is regulating for the evidence to create its codes, which, as the noble Viscount will appreciate, is a formula for the tech companies to control the flow of evidence and unduly temper the intent of the legislation. I wish to make most of my remarks on that subject.
In Ofcom’s consultation on its illegal harms code, the disparity between the harms identified and Ofcom’s proposed code caused deep concern. Volume 4 states the following at paragraph 14.12 in relation to content moderation:
“We are not proposing to recommend some measures which may be effective in reducing risks of harm. This is principally due to currently limited evidence”.
Further reading of volume 4 confirms that the lack of evidence is the given reason for failing to recommend measures across a number of harms. Ofcom has identified harms for which it does not require mitigation. This is not what Parliament intended and spectacularly fails to deliver on the promises made by Ministers. Ofcom can use its information-gathering powers to build the evidence of efficacy required to take a bolder approach to measures but, although that is welcome, it is unsatisfactory for many reasons.
First, given the interconnectedness between privacy, safety, security and competition, regulatory standards cannot be developed in silos. We have a thriving academic community that can work across different risks and identify solutions across different parts of the tech ecosystem.
Secondly, a regulatory framework in which standards are determined exclusively through private dialogue between the regulator and the regulated does not have the necessary transparency and accountability to win public trust.
Thirdly, regulators are overstretched and under-resourced. Our academics stand ready and willing to work in the public interest and in accordance with the highest ethical standards in order to scrutinise and understand the data held so very closely by tech companies, but they need a legal basis to demand access.
Fourthly, if we are to maintain our academic institutions in a post-Brexit world, we need to offer UK academics the same support as those in Europe. Article 40(4) of the European Union’s Digital Services Act requires platforms to
“provide access to data to vetted researchers”
seeking to carry out
“research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35”.
It will be a considerable loss to the UK academic sector if its European colleagues have access to data that it does not.
Fifthly, by insisting on evidence but not creating a critical pathway to secure it, the Government have created a situation in which the lack of evidence could mean that Ofcom’s codes are fixed at what the tech companies tell it is possible in spring 2024, and will always be backward-looking. There is considerable whistleblower evidence revealing measures that the companies could have taken but chose not to.
I have considerable personal experience of this. For example, it was nearly a decade ago that I told Facebook that direct messaging on children’s accounts was dangerous, yet only now are we beginning to see regulation reflecting that blindingly obvious fact. That is nearly a decade in which something could have been done by the company but was not, and of which the regulator will have no evidence.
Finally, as we discussed on day one in Committee, the Government have made it easier for commercial companies to use personal data for research by lowering the bar for the collection of data and expanding the concept of research, further building the asymmetry that has been mentioned in every group of amendments we have debated thus far. It may not be very parliamentary language, but it is crazy to pass legislation and then obstruct its implementation by insisting on evidence that you have made it impossible to gather.
I would be grateful if the Minister could answer the following questions when he responds. Is it the Government’s intention that Ofcom codes be based entirely on the current practice of tech companies and that the regulator can demand only mitigations that exist currently, as evidenced by those companies? Do the Government agree that whistleblowers, NGO experts and evidence from user experience can be taken by regulators as evidence of what could or should be done? What route do the Government advise Ofcom to take to mitigate identified risks for which there are no current measures in place? For example, should Ofcom describe the required outcome and leave it to the companies to determine how they mitigate the risk, should it suggest mitigations that have been developed but not tried—or is the real outcome of the OSA to identify risk and leave that risk in place?
Do the Government accept that EU research done under the auspices of the DSA should be automatically considered as an adequate basis for UK regulators where the concerns overlap with UK law? Will the new measures announced for testing and sandboxing of AI models allow for independent research, in which academics, independent of government or tech, will have access to data? Finally, what measures will the Government take to mitigate the impact on universities of a brain drain of academics to Europe, if we do not provide equivalent legislative support to enable them to access the data required to study online safety and privacy? If the Minister is unable to answer me from the Dispatch Box, perhaps he will agree to write to me and place his letter in the Library for other noble Lords to read.
My Lords, there is little for me to say. The noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, have left no stone unturned in this debate. They introduced this amendment superbly, and I pay tribute to them and to Reset, which was with us all the way through the discussions on online harms at the Joint Committee on the draft Online Safety Bill, advocating for these important provisions.
As the noble Lord, Lord Bethell, said, there is a strong body of opinion out there. Insight from what might be called approved independent researchers would enable policy-making and regulatory innovation to keep pace with emerging trends and threats, which can span individual harms, matters of public safety and even national security. We have seen the kinds of harms taking place in social media, and it is absolutely vital that we understand what is happening under the bonnet of social media. It is crucial in detecting, identifying and understanding the systemic risks of online harms and non-compliance with law.
When we discussed the Online Safety Bill, it was a question of not just content but functionality. That was one of the key things. An awful lot of this research relates to that: how algorithms operate in amplifying content and some of the harms taking place on social media. The noble Lord, Lord Bethell, referred to X closing its API for researchers and Meta’s move to shut CrowdTangle. We are going into reverse, whereas we should be moving forward in a much more positive way. When the Online Safety Bill was discussed, we got the review from Ofcom, but we did not get the backup—the legislative power for Ofcom or the ICO to be able to authorise and accredit researchers to carry out the necessary research.
The Government’s response to date has been extremely disappointing, given the history behind this and the pressure and importance of this issue. This dates from discussions some way back, even before the Joint Committee met and heard the case for this kind of researcher access. This Bill is now the best vehicle by which to introduce a proper regime on access for researchers. As the noble Baroness, Lady Kidron, asked, why, having had ministerial assurances, are we not seeing further progress? Are we just going to wait until Ofcom produces its review, which will be at the tail end of a huge programme of work which it has to carry out in order to implement the Online Safety Act?
That was a very good conclusion to the response from the noble Lord, Lord Bethell—urging a Minister to lean in. I have not heard that expression used in the House before, but it is excellent because, faced with a Home Office Minister, I am sure that is the kind of behaviour that we can expect imminently.
Last time we debated issues relating to national security and data protection, the noble Lord, Lord Ashton, was the responsible Minister and I had the support of the noble Lord, Lord Paddick. Now I have the Minister all to myself on Amendments 135A to 135E and the stand part notices on Clauses 28 to 30. These Benches believe that, as drafted, these clauses fall foul of the UK’s obligations under the ECHR, because they give the Home Secretary too broad a discretion and do not create sufficient safeguards to prevent their misuse.
Under the case law of the European Court of Human Rights, laws that give unfettered or overly broad discretion to the Government to interfere with privacy will violate the convention, because laws must be sufficiently specific to prevent abuses of power. This means the Government must make sure that, any time they interfere with the privacy of people in the UK, they obey the law, have a goal that is legitimate in a democratic society and do only what is truly necessary to achieve that goal. The court has repeatedly stressed that this is what the rule of law means; it is an essential principle of democracy.
Despite multiple requests from MPs, and from Rights and Security International in particular, the Government have also failed to explain why they believe that these clauses are necessary to safeguard national security. So far, they have explained only why these new powers would be “helpful” or would ensure “greater efficiency”. Those justifications do not meet the standard that the ECHR requires when the Government want to interfere with our privacy. They are not entitled to do just anything that they find helpful.
Under Clause 28(7), the Home Secretary would be able to issue a national security certificate to tell the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey. For instance, a national security certificate would give the police immunity when they commit crimes by using personal data illegally. It would also exempt them from certain provisions of the Freedom of Information Act 2000. The Bill would expand what counts as an intelligence service for the purposes of data protection law—again, at the Home Secretary’s wish. Clause 29 would allow the Home Secretary to issue a designation notice allowing law enforcement bodies, whenever they collaborate with the security services, to take advantage of the more relaxed rules in the Data Protection Act 2018 that were otherwise designed for the intelligence agencies.
Both the amended approach to national security certificates and the new designation notice regime would be unaccountable. The courts would not be able to review what the Government are doing and Parliament might therefore never find out. National security certificates are unchallengeable before the courts, meaning that the police and the Home Secretary would be unaccountable if they abused those powers. If the Home Secretary says that the police need to use these increased—and, in our view, unnecessary—powers in relation to national security, his word will be final. This includes the power to commit crimes.
As regards designation notices, the Home Secretary is responsible for approving and reviewing their use. Only a person who is directly affected by a designation notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, in which case how could anybody know that the police had been snooping on their lives under this law?
Clauses 28 to 30 could, in our view, further violate the UK’s obligations under the Human Rights Act 1998 and the European Convention on Human Rights because they remove the courts’ role in reviewing how the Government use their surveillance power. The European Court of Human Rights has ruled in the past that large aspects of the law previously governing the UK’s surveillance powers were unlawful because they gave the Government too much discretion and lacked important safeguards to prevent misuse. Clauses 28 to 30 could be challenged on similar grounds, and the court has shown that it is willing to rule on these issues. These weaknesses in the law could also harm important relationships that the UK has with the EU as regards data adequacy, a subject that we will no doubt discuss in further depth later this week.
The Government argue that the clauses create a simplified legal framework that would improve the efficiency of police operations when working with the intelligence services. This is far from meeting the necessity standard under the ECHR.
The Government have frequently used the Fishmongers’ Hall and Manchester Arena attacks to support the idea that Clauses 28 to 30 are desirable. However, a difference in data protection regimes was not the issue in either case; instead, the problem centred on failures in offender management, along with a lack of communication between the intelligence services and local police. The Government have not explained how Clauses 28 to 30 would have prevented either incident or why they think these clauses are necessary to prevent whatever forms of violence the Government regard as most likely to occur in the future. The Government have had sufficient opportunity to date to explain the rationale for these clauses, yet they have so far failed to do so. For these reasons, we are of the view that Clauses 28 to 30 should not stand part of the Bill.
However, it is also worth putting down amendments to try to tease out additional aspects of these clauses, so Amendments 135A and 135D would put proportionality back in. It is not clear why the word “proportionality” has been taken out of the existing legislation. Similarly, Amendment 135B attempts to put back in the principles that should underpin decisions. Those are the most troubling changes, since they seem to allow for departure from basic data protection principles. These were the principles that the Government, during the passage of the Data Protection Act 2018, assured Parliament would always be secure. The noble Lord, Lord Ashton of Hyde, said:
“People will always have the right to ensure that the data held about them is fair and accurate, and consistent with the data protection principles”.—[Official Report, 10/10/17; col. 126.]
Thirdly, now seems a good time to introduce oversight by a judicial commissioner for Clause 28 certificates. During the passage of the Data Protection Act through Parliament, there was much debate over the Part 2 national security exemption for general processing in Section 26 and the national security certificates in Section 27. We expressed concern then but, sadly, the judicial commissioner role was not included. This is a timely moment to suggest it again.
Finally, on increasing the oversight of the Information Commissioner under Amendment 135E, I hope that this will be an opportunity for the Minister, despite the fact that I would prefer to see Clauses 28 to 30 not form part of the Bill, to explain in greater detail why they are constructed in the way they are and why the Home Office believes that it needs to amend the legislation in the way it proposes. I beg to move.
My Lords, I come to this topic rather late and without the star quality in this area that has today been attributed to the noble Lord, Lord Kirkhope. I acknowledge both the work of Justice in helping me to understand what Clause 28 does and the work of the noble Lord, Lord Clement-Jones, in formulating the probing amendments in this group. I echo his questions on Clause 28. I will focus on a few specific matters.
First, what is the difference between the existing formulation for restricting data protection rights “when necessary and proportionate” to protect national security and the new formulation,
“when required to safeguard national security”?
What is the purpose of that change? Does “required” mean the same as “necessary” or something different? Do the restrictions not need to be proportionate any more? If so, why? Could we have a practical example of what the change is likely to mean in practice?
Secondly, why is it necessary to expand the number of rights and obligations from which competent law enforcement authorities can be exempted for reasons of national security? I can understand why it may for national security reasons be necessary to restrict a person’s right to be informed, right of access to data or right to be notified of a data breach, as under the existing law, but Clause 28 would allow the disapplication of some very basic principles of data protection law—including, as I understand it, the right to have your data processed only for a specified, explicit and legitimate purpose, as well as the right not to have decisions about you made solely by automated methods.
Thirdly, as the noble Lord, Lord Clement-Jones, asked, why is it necessary to remove the powers of the Information Commissioner to investigate, to enter and inspect, and, where necessary, to issue notices? I appreciate that certificates will remain appealable to the Upper Tribunal by the person directly affected, applying judicial review principles, but that is surely not a substitute for review by the skilled and experienced ICO. Apart from anything else, the subject is unlikely even to know that they have been affected by the provisions, given that a certificate would exempt law enforcement from having to provide information to them. That is precisely why the oversight of a commissioner in the national security area is so important.
As for Clauses 29 and 30, I am as keen as anybody to improve the capabilities for the joint processing of data by the police and intelligence agencies. That was a major theme of the learning points from the London and Manchester attacks of 2017, which I helped to formulate in that year and on which I reported publicly in 2019. A joint processing regime certainly sounds like a good idea in principle but I would be grateful if the Minister could confirm which law enforcement competent authorities will be subject to this new regime. Are they limited to Counter Terrorism Policing and the National Crime Agency?
Perhaps the same correspondence could cover the point I raised as well.
My Lords, I am immensely grateful to the noble Lords, Lord Anderson and Lord Bassam, for their interventions. In particular, given his background, if the noble Lord, Lord Anderson, has concerns about these clauses, we all ought to have concerns. I am grateful to the Minister for the extent of his unpacking—or attempted unpacking—of these clauses but I feel that we are on a slippery slope here. I feel some considerable unease about the widening of the disapplication of principles that we were assured were immutable only six years ago. I am worried about that.
We have had some reassurance about the right to transparency, though perhaps data subjects will find out what is happening only when it is convenient. The right to challenge was also mentioned by the Minister, but he has not really answered the question about whether the Home Office has looked seriously at the implications as far as the human rights convention is concerned, which is the reason for the stand part notice. The Minister did not address that matter at all; I do not know why. I am assuming that the Home Office has looked at the clauses in the light of the convention but, again, he did not talk about that.
The only assurance the Minister has really given is that it is all on a case-by-case basis. I do not think that that is much of a reassurance. On the proportionality point made by the noble Lord, Lord Anderson, I think that we are going to be agog in waiting for the Minister’s correspondence on that, but it is such a basic issue. There were two amendments specifically on proportionality but we have not really had a reply on that issue at all, in terms of why it should have been eliminated by the legislation. So a feeling of unease prevails. I do not even feel that the Minister has unpacked fully the issue of joint working; I think that the noble Lord, Lord Anderson, did that more. We need to know more about how that will operate.
The final point that the Minister made gave even greater concern—to think that there will be an SI setting out the bodies that will have the powers. We are probably slightly wiser than when we started out with this group of amendments, but only slightly and we are considerably more concerned. In the meantime, I beg leave to withdraw the amendment.
My Lords, the noble Baroness, Lady Morgan, has done us a service by raising this issue. My question is about whether the advice given to date about redaction is accurate. I have not seen the Home Office’s guidance or counsel’s analysis. I have taken advice on the Police Federation’s case—I received an email and I was very interested in what it had to say, because we all want to make sure that the bureaucracy involved in charging and dealing with the CPS is as minimal as possible within the bounds of data protection law.
Section 35(2)(b) of the Data Protection Act simply requires the police to ensure that their processing is necessary for the performance of their tasks. You would have thought that sending an investigation file to the CPS to decide whether to charge a suspect seems necessary for the performance of that task. Some of that personal data may end up not being relevant to the charge or any trial, but that is a judgment for the CPS and the prosecutor. It does not mean, in the view of those I have consulted, that the file has to be redacted at vast taxpayer cost before the CPS or the prosecutor has had a chance to see it. When you look at sensitive data, the test is “strictly necessary”, which is a higher test, but surely the answer to that must be that officers should collect this information only where they consider it relevant to the case. So this can be dealt with through protocols about data protection, which ensure that officers do not collect more sensitive data than is necessary for the purposes of the investigation.
Similarly, under Section 37, the requirement that the personal data must be adequate, relevant and not excessive in relation to the purpose for which it is processed should not be interpreted in such a way that this redaction exercise is required. If an officer thinks they need to collect the relevant information for the purpose of the investigation, that seems to me—and to those advising me—in broad terms to be sufficient to comply with the principle. Conversely, if officers are collecting too much data, the answer is that they should be trained to avoid doing this. If officers really are collecting more information than they should be, redactions cannot remedy the fact that the collection was unlawful in the first place. The solution seems to be to stop them collecting that data.
I assume—maybe I am completely wrong—that the Minister will utter “suitable guidance” in response to the noble Baroness’s amendment and say that there is no need to amend the legislation but, if there is no need to do so, I hope that the Government will revise the guidance, because the Police Federation and its members are clearly labouring under a misapprehension about the way the Act should be interpreted. It would be quite a serious matter if that has been the case for the last six years.
My Lords, we should be very grateful to the noble Baroness, Lady Morgan of Cotes, for her amendment. I listened very carefully to her line of argument and find much that we can support in the approach. In that context, we should also thank the Police Federation of England and Wales for a particularly useful and enlightening briefing paper.
We may well be suffering under the law of unintended consequences in this context; it seems to have hit quite hard and acted as a barrier to the sensible processing and transfer of data between two parts of the law enforcement machinery. It is quite interesting coming off the back of the previous debate, when we were discussing making the transfer of information and intelligence between different agencies easier and having a common approach. It is a very relevant discussion to have.
I do not think that the legislation, when it was originally drafted, could ever have been intended to work in the way the Police Federation has set out. The implementation of the Data Protection Act 2018, in so far as law enforcement agencies are concerned, is supposed to be guided by recital 4, which the noble Baroness read into the record and which makes good sense.
As the noble Baroness explained, the Police Federation argues that the DPA makes no provision at all designed to facilitate, in effect, the free flow of information; that it should be able to hold all the relevant data prior to the charging decision being made by the CPS; and that redaction should take place only after a decision on charging has been made. That seems quite a sensible approach. As she argued, it would significantly lighten the burden on police investigating teams and enable the decision on charging to be more broadly informed.
So this is a piece of simplification that we can all support. The case has been made very well. If it helps speed up charging and policing processes, which I know the Government are very concerned about, as all Governments should be, it seems a sensible move—but this is the Home Office. We do not always expect the most sensible things to be delivered by that department, but we hope that they are.
I thank all noble Lords for their contributions—I think. I thank my noble friend Lady Morgan of Cotes for her amendment and for raising what is an important issue. Amendment 137 seeks to permit the police and the Crown Prosecution Service to share unredacted data with one another when making a charging decision. Perhaps to the surprise of the noble Lord, Lord Bassam, we agree: we must reduce the burden of redaction on the police. As my noble friend noted, this is very substantial and costly.
We welcome the intent of the amendment. However, as my noble friend has noted, we do not believe that, as drafted, it would achieve the stated aim. Fully removing the redaction burden would require amending more than just the Data Protection Act.
The Government are committed to reducing the burden on the police, but it is important that we get it right and that the solution is comprehensive. We consider that the objective my noble friend is seeking would be better achieved through other means, including improved technology and new, simplified guidance to prevent over-redaction, as all speakers, including the noble Lord, Lord Clement-Jones, noted.
The Home Office provided £960,000 of funding for text and audio-visual multimedia redaction in the 2023-24 financial year. Thanks to that funding, police forces have been able to procure automated text redaction tools, the trials of which have demonstrated that they could save up to 80% of the time spent by the police on this redaction. Furthermore, in the latest Budget, the Chancellor announced an additional £230 million of funding for technology to boost police productivity. This will be used to develop, test and roll out automated audio-visual redaction tools, saving thousands more hours of police time. I would say to my noble friend that, as the technology improves, we hope that the need for it to be supervised by individuals will diminish.
I can also tell your Lordships’ House that officials from the Home Office have consulted with the Information Commissioner’s Office and have agreed that a significant proportion of the burden caused by existing pre-charge redaction processes could be reduced safely and lawfully within the current data protection framework in a way that will maintain standards and protections for individuals. We are, therefore, actively working to tackle this issue in the most appropriate way by exploring how we can significantly reduce the redaction burden at the pre-charge stage through process change within the existing legislative framework. This will involve creating simplified guidance and, obviously, the use of better technology.
Is the Minister almost agreeing with some of my analysis in that case?
No, I think I was agreeing with my noble friend’s analysis.
I thank all noble Lords for their contributions. We acknowledge this particular problem and we are working to fix it. I would ask my noble friend to withdraw her amendment.
My Lords, I will speak also to Amendment 140 and the submissions that Clauses 32 to 35 should not stand part. These amendments are designed to clarify the statutory objective of the new information commission; increase its arm’s-length relationship with the Government; allow effective judicial scrutiny of its regulatory function; allow not-for-profit organisations to lodge representative complaints; retain the Office of the Biometrics and Surveillance Camera Commissioner; and empower the Equality and Human Rights Commission to scrutinise the new information commission. The effective supervision and enforcement of data protection, and the investigation and detection of offenders, are crucial to achieving deterrence, preventing violations, maintaining transparency and providing options for redress against data misuse.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and to see whether they can be improved, rather than throw out the idea of an information commission and revert to the ICO on the basis that perhaps the information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the aspect of the Secretary of State power.
We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use the phrase from the noble Baroness, Lady Kidron. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and not strengthening children’s rights.
I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State’s powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State, so perhaps the Minister has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.
My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. We had a debate, brought by the noble Baroness on 23 November last year, on this very issue of edtech. Rather than repeat all the points that were made in that very useful debate, I point my noble friend the Minister to it.
I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that thinking that any element of society can mitigate the risks of technology without legal guard-rails is a mistake.
We have seen really clearly with the age-appropriate design code that commercial organisations operating under its purview changed the way they protected children’s data as a result of that code. The absence of an equivalent code for the edtech sector should show us clearly that we have not had those same benefits. If we bring edtech into scope, either through this amendment or simply through extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.
In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the keeping children safe in education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is lots of really important stuff in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standards of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but the edtech companies’ responsibility, and it is clear that there is no section on that in the keeping children safe in education framework either.
The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—the age-appropriate design code or a specific edtech code—how can we be confident that there really are the guardrails, which we know we need to put in place in every sector, in this most precious and important sector, which is where we teach our children?
My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year. Listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening, as we speak, with edtech in terms of the extraction of data and the failure to comply properly with data protection.
I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one of the very important elements was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, this is data specific, because we are talking about the data protection Bill, but what is being proposed here would cure some of the issues that are staring us in the face.
As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.
From this direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to want to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.
Data Protection and Digital Information Bill Debate
(8 months, 1 week ago)
Grand Committee
My Lords, I start today with probably the most innocuous of the amendments, which is that Clause 44 should not stand part. Others are more significant, but its purpose, if one can describe it as such, is as a probing clause stand part notice, to see whether the Minister can explain the real motive and impact of new Section 164A, which is inserted by Clause 44. As the explanatory statement says, it appears to hinder
“data subjects’ right to lodge complaints, and extends the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the Commissioner’s response to a complaint.”
I am looking to the Minister to see whether he can unpack the reasons for that and what the impact is on data subjects’ rights.
More fundamental is Amendment 153, which relates to Clause 45. This provision inserts new Section 165A into the Data Protection Act, under which the commissioner would have the discretion to refuse to act on a complaint unless the complainant had first tried to resolve the infringement of their rights with the relevant organisation and at least 45 days had passed since then. The right to an effective remedy constitutes a core element of data protection—most individuals will not pursue cases before a court, because of the lengthy, time-consuming and costly nature of judicial proceedings—and acts as a deterrent against data protection violations, in so far as victims can obtain meaningful redress. Administrative remedies are particularly useful, because they focus on addressing malpractice and obtaining meaningful changes in how personal data is handled in practice.
However, the ICO indicates that in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Moreover, avenues to challenge ICO inaction are extremely limited. Scrutiny by the information tribunal has been restricted to purely procedural, as opposed to substantive, matters. It was narrowed even further by the Administrative Court’s decision, which found that the ICO was not obliged to investigate each and every complaint.
Amendment 153 would remove Clause 45. The ICO already enjoys a wide margin of discretion and little accountability for how it handles complaints. In light of its poor performance, it does not seem appropriate to expand the discretion of the new information commission even further. The amendment would also extend the scope of orders under Section 166 of the Data Protection Act to the appropriateness of the commissioner’s response to a complaint. This would allow individuals to pursue judicial scrutiny of decisions that have a fundamental impact on how laws are enforced in practice, and it would increase the overall accountability of the new information commission.
We have signed Amendment 154, in the name of the noble Baroness, Lady Jones, and I look forward to hearing what she says on that. I apologise for the late tabling of Amendments 154A to 154F, which are all related to Amendments 155 and 175. Clause 47 sets out changes in court procedure in relation to a data subject’s right to information under the 2018 Act, but there are other issues that need resolving around the jurisdiction of the courts and the Upper Tribunal in data protection cases. That is the reason for tabling these amendments.
The High Court’s judgment in Delo v ICO held that part of the reasoning in Killock and Veale about the relative jurisdiction of the courts and tribunals was wrong. The Court of Appeal’s decision in Delo underlines the concerns but does not properly address the jurisdictional limits in Sections 166 and 167 of the 2018 Act, regarding the distinction between determining procedural failings and the merits of decisions by the ICO. Surely jurisdiction under these sections should lie in either the courts or the tribunals, not both. In the view of many, including me, it should lie in the tribunals. That is what these amendments seek.
It is clear from these two judgments that there was disagreement on the extent of the jurisdiction of tribunals and courts, notably between Mrs Justice Farbey and Mr Justice Mostyn. The commissioner made very different submissions to the Upper Tribunal, the High Court and the Court of Appeal in relation to the extent and limits of Sections 166 and 167. It is not at all clear what Parliament intended, when passing the 2018 Act, as to the extent and limits of the powers in these sections and whether the appropriate source of redress is a court or a tribunal.
This has resulted in jurisdictional confusion. A large number of claims have been brought in either the courts or the tribunals, under either Section 166 or Section 167, and the respective court or tribunal has frequently ruled that the claim should have been made under the other section and it therefore does not have jurisdiction, so that the claim is struck out. The Bill offers a prime opportunity to resolve this issue.
Clause 45(5), which creates new Section 166A, would only blur the lines even more and fortify the reasoning for the claim to be put into the tribunals, rather than the courts. These amendments would give certainty to the courts and tribunals as to their powers and would be much less confusing for litigants in person, most of whom do not have the luxury of paying hundreds of thousands in court fees. This itself is another reason for this to remain in the tribunals, which do not charge fees to issue proceedings.
The proposed new clause inserted by Amendment 287 would require the Secretary of State to exercise powers under Section 190 of the 2018 Act to allow public interest organisations to raise data protection complaints on behalf of individuals generally, without the need to obtain the authorisation of each individual being represented. It would therefore implement Article 80(2) of the GDPR, which provides:
“Member States may provide that any body, organisation or association referred to in paragraph 1 of this Article, independently of a data subject’s mandate, has the right to lodge, in that Member State, a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing”.
The intention behind Article 80(2) is to allow appropriately constituted organisations to bring proceedings concerning infringements of the data protection regulations in the absence of the data subject. That is to ensure that proceedings may be brought in response to an infringement, rather than on the specific facts of an individual’s case. As a result, data subjects are, in theory, offered greater and more effective protection of their rights. Actions under Article 80(2) could address systemic infringements that arise by design, rather than requiring an individual to evidence the breaches and the specific effects to them.
At present, an affected individual—a data subject—is always required to bring a claim or complaint to a supervisory authority. Whether through direct action or under Section 187 of the 2018 Act, a data subject will have to be named and engaged. In practice, a data subject is not always identifiable or willing to bring action to address even the most egregious conduct.
Article 80(2) would fill a gap that Article 80(1) and Section 187 of the Data Protection Act are not intended to fill. Individuals can be unwilling to seek justice, exercise their rights and lodge data protection complaints on their own, either for fear of retaliation from a powerful organisation or because of the stigma that may be associated with the matter where a data protection violation occurred. Even a motivated data subject may be unwilling to take action due to the risks involved. For instance, it would be reasonable for that data subject not to want to become involved in a lengthy, costly legal process that may be disproportionate to the loss suffered or remedy available. This is particularly pressing where the infringement raises systemic concerns, rather than where an individual has suffered material or non-material damage as a result of the infringement.
Civil society organisations have long helped complainants navigate justice systems in seeking remedies in the data protection area, providing a valuable complement to the enforcement of UK data protection laws. My Amendment 287 would allow public interest organisations to lodge representative complaints, even without the mandate of data subjects, to encourage the filing of well-argued, strategically important cases with the potential to improve significantly the data subject landscape as a whole. This Bill is the ideal opportunity for the Government to implement fully Article 80(2) of the GDPR and plug a significant gap in the protection of UK citizens’ privacy.
In effect, this is unfinished business from our debates on the 2018 Act, when we made several attempts to persuade the Government of the merits of introducing the rights under Article 80(2). I hope that the Government will think again. These are extremely important rights and are available in many other countries governed by a similar GDPR. I beg to move.
My Lords, as a veteran of the 2018 arguments on Article 80(2), I rise in support of Amendment 287, which would see its implementation.
Understanding and exercising personal data rights is not straightforward. Even when the rights are being infringed, it is rare that an individual data subject has the time, knowledge or ability to make a complaint to the ICO. This is particularly true for vulnerable groups, including children and the elderly, disadvantaged groups and other groups of people, such as domestic abuse survivors or members of the LGBTQ community, who may have specific reasons for not identifying themselves in relation to a complaint. It is a principle in law that a right that cannot be activated is not fully given.
A data subject’s ability to claim protection is constrained by a range of factors, none of which relates to the validity of their complaint or the level of harm experienced. Rather, the vast majority are prevented from making a complaint by a lack of expertise, capacity, time and money; by the fact that they are not aware that they have data rights; or by the fact that they understand neither that their rights have been infringed nor how to make a complaint about them.
I have considerable experience of this. I remind the Committee that I am chair of the 5Rights Foundation, which has raised important and systemic issues of non-compliance with the AADC. It has done this primarily by raising concerns with the ICO, which has then undertaken around 40 investigations based on detailed submissions. However, because the information is not part of a formalised process, the ICO has no obligation to respond to the 5Rights Foundation team, the three-month time limit for complaints does not apply and, even though forensic work by the 5Rights Foundation identified the problem, its team is not consulted or updated on progress or the outcome—all of which would be possible had it submitted the information as a formal complaint. I remind the Committee that in these cases we are talking about complaints involving children.
My Lords, just so that the Minister might get a little note, I will ask a question. He has explained what is possible—what can be done—but not why the Government still resist putting Article 80(2) into effect. What is the reason for not adopting that article?
The reason was that the Government undertook an extensive consultation in 2021 and concluded at that time that there was insufficient evidence to take what would necessarily be a complex step. That was largely on the grounds that class actions of this type can go forward either with the consent of any named individuals in the class action or on behalf of a group of individuals who are unnamed and not specifically identified within the investigation itself.
Perhaps the Minister could in due course say what evidence would help to persuade the Government to adopt the article.
I want to help the Minister. Perhaps he could give us some more detail on the nature of that consultation and the number of responses and what people said in it. It strikes me as rather important.
Fair enough. Maybe for the time being, it will satisfy the Committee if I share a copy of that consultation and what evidence was considered, if that would work.
I will turn now to Amendments 154A to 155 and Amendment 175, which propose sweeping modifications to the jurisdiction of the courts and tribunals for proceedings under the Data Protection Act 2018. These amendments would have the effect of making the First-tier Tribunal and Upper Tribunal responsible for all data protection cases, transferring both ongoing and future cases out of the court system and to the relevant tribunals.
The Government of course want to ensure that proceedings for the enforcement of data protection rules, including the redress routes available to data subjects, are appropriate to the nature of the complaint. As the Committee will be well aware, at present there is a mixture of jurisdiction between tribunals and courts under data protection legislation, depending on the precise nature of the proceedings in question. Tribunals are indeed the appropriate venue for some data protection proceedings, and the legislation already recognises that—for example, for applications by data subjects for an order requiring the ICO to progress their complaint. However, courts are generally the more appropriate venue for cases involving claims for compensation, and successful parties can usually recover their costs. Courts also apply stricter rules of procedure and evidence than tribunals. Some cases are therefore appropriate to fall under the jurisdiction of the tribunal, while others are more appropriate for court jurisdiction. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensatory damages for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in accordance with its strict procedural and evidential rules, where the data subject may recover their costs if successful.
As such, the Government are confident that the current system is balanced and proportionate and provides clear and effective administrative and judicial redress routes for data subjects seeking to exercise their rights.
My Lords, is the Minister saying that there is absolutely no confusion between the jurisdiction of the tribunals and the courts? That is, no court has come to a different conclusion about jurisdiction—for example, as to whether procedural matters are for tribunals and merits are for courts or vice versa. Is he saying that everything is hunky-dory and clear and that we do not need to concern ourselves with this crossover of jurisdiction?
No, as I was about to say, we need to take these issues seriously. The noble Lord raised a number of specific cases. I was unfamiliar with them at the start of the debate—
I will go away and look at those; I look forward to learning more about them. There are obvious implications in what the noble Lord said as to the most effective ways of distributing cases between courts and other channels.
For these reasons, I hope that the noble Lord will withdraw his amendment.
I would be very happy to participate in that discussion, absolutely.
My Lords, I thank the Minister for his response. I have surprised myself: I have taken something positive away from the Bill.
The noble Baroness, Lady Jones, was quite right to be more positive about Clause 44 than I was. The Minister unpacked its relationship with Clause 45 well and satisfactorily. Obviously, we will read Hansard before we jump to too positive a conclusion.
On Article 80(2), I am grateful to the Minister for agreeing both to go back to the consultation and to look at the kinds of evidence that were brought forward, because this is a really important aspect for many civil society organisations. He underestimates the difficulties faced when bringing complaints of this nature. I would very much like this conversation to go forward because this issue has been quite a bone of contention; the noble Baroness, Lady Kidron, remembers that only too well. We may even have had ping-pong on the matter back in 2017. There is an appetite to keep on the case so, the more we can discuss this matter—between Committee and Report in particular—the better, because there is quite a head of steam behind it.
As far as the jurisdiction point is concerned, I think this may be the first time I have heard a Minister talk about the Sorting Hat. I was impressed: I have often compared this place to Hogwarts, but the concept of using the Sorting Hat to decide whether a case goes to a tribunal or a court is a wonderful one. You would probably need artificial intelligence to do that kind of thing nowadays, which is itself a bit of an issue. These may be elaborate amendments but, as the noble Lord, Lord Bassam, said, the case being made here is that there may be confusion and a lack of clarity about where jurisdiction lies. It is really important that we determine whether the courts and tribunals themselves understand this and, perhaps more to the point, whether they have differing views about it.
We need to get to grips with this; the more the Minister can dig into it, and into Delo, Killock and so on, the better. We are all in the foothills here but I am certainly not going to try to unpack those two judgments and the differences between Mrs Justice Farbey and Mr Justice Mostyn, which are well beyond my competency. I thank the Minister.
My Lords, the UK has rightly moved away from the EU concept of supremacy, under which retained EU law would always take precedence over domestic law when they were in conflict. That is clearly unacceptable now that we have left the EU. However, we understand that the effective functioning of our data protection legislation is of critical importance and it is appropriate for us to specify the appropriate relationship between UK and EU-derived pieces of legislation following implementation of the Retained EU Law (Revocation and Reform) Act, or REUL. That is why I am introducing a number of specific government amendments to ensure that the hierarchy of legislation works in the data protection context. These are Amendments 156 to 164 and 297.
Noble Lords may be aware that Clause 49 originally sought to clarify the relationship between the UK’s data protection legislation, specifically the UK GDPR and EU-derived aspects of the Data Protection Act 2018, and future data processing provisions in other legislation, such as powers to share or duties to disclose personal data, as a result of some legal uncertainty created by the European Union (Withdrawal) Act 2018. To resolve this uncertainty, Clause 49 makes it clear that all new data processing provisions in legislation should be read consistently with the key requirements of the UK data protection legislation unless it is expressly indicated otherwise. Since the Bill’s introduction, however, the REUL Act has altered how pre-EU exit legislation is interpreted, and there is a risk that this would produce the wrong effect in respect of existing data processing provisions that are silent about their relationship with the data protection legislation.
Amendment 159 will make it clear that the full removal of the principle of EU law supremacy and the creation of a reverse hierarchy in relation to assimilated direct legislation, as provided for in the REUL Act, do not change the relationship between the UK data protection legislation and existing legislation that is in force prior to commencement of Clause 49(2). Amendment 163 makes a technical amendment to the EU withdrawal Act, as amended, to support this amendment.
Amendment 162 is similar to the previous amendment but it concerns the relationship between provisions relating to certain obligations and rights under data protection legislation and on restrictions and prohibitions on the disclosure of information under other existing legislation. Existing Section 186 of the Data Protection Act 2018 governs this relationship. Amendment 162 makes it clear that the relationship between these two types of provision is not affected by the changes to the interpretation of legislation that I have already referred to made by the REUL Act. Additionally, it clarifies that, in relation to pre-commencement legislation, Section 186(1) may be disapplied expressly or impliedly.
Amendment 164 relates to the changes brought about by the REUL Act and sets out that the provisions detailed in earlier Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act.
Amendment 297 provides a limited power to remove provisions that achieve the same effect as new Section 183A from legislation made or passed after this Bill receives Royal Assent, as their presence could cause confusion.
Finally, Amendments 156 and 157 are consequential. Amendments 158, 160 and 161 are minor drafting changes made for consistency, updating and consequential purposes.
Turning to the amendments introduced by the noble Lord, Lord Clement-Jones, I hope that he can see from the government amendments to Clause 49 that we have given a good deal of thought to the impact of the REUL Act 2023 on the UK’s data protection framework and have been prepared to take action on this where necessary. We have also considered whether some of the changes made by the REUL Act could cause confusion about how the UK GDPR and the Data Protection Act 2018 interrelate. Following careful analysis, we have concluded that they would largely continue to be read alongside each other in the intended way, with the rules of the REUL Act unlikely to interfere with this. Any new general rule such as that suggested by the noble Lord could create confusion and uncertainty.
Amendments 168 to 170, 174, 174A and 174B seek to reverse changes introduced by the REUL Act at the end of 2023, specifically the removal of EU general principles from the statute book. EU general principles and certain EU-derived rights had originally been retained by the European Union (Withdrawal) Act to ensure legal continuity at the end of the transition period, but this was constitutionally novel and inappropriate for the long term.
The Government’s position is that EU law concepts should not be used to interpret domestic legislation in perpetuity. The REUL Act provided a solution to this by repealing EU general principles from UK law and clarifying the approach to be taken domestically. The amendments tabled by the noble Lord, Lord Clement-Jones, would undo this important work by reintroducing to the statute book references to rights and principles which have not been clearly defined and are inappropriate now that we have left the EU.
The protection of personal data already forms part of the protection offered by the European Convention on Human Rights, under the Article 8 right to respect for private and family life, and is further protected by our data protection legislation. The UK GDPR and the Data Protection Act 2018 provide a comprehensive set of rules for organisations to follow and rights for people in relation to the use of their data. Seeking to apply an additional EU right to data protection in UK law would not significantly affect the way the data protection framework functions or enhance the protections it affords to individuals. Indeed, doing so may well add unnecessary uncertainty and complexity.
Amendments 171 to 173 pertain to exemptions to specified data subject rights and obligations on data controllers set out in Schedules 2 to 4 to the DPA 2018. The 36 exemptions apply only in specified circumstances and are subject to various safeguards. Before addressing the amendments the noble Lord has tabled, it is perhaps helpful to set out how these exemptions are used. Personal data must be processed according to the requirements set out in the UK GDPR and the DPA 2018. This includes the key principles of lawfulness, fairness and transparency, data minimisation and purpose limitation, among others. The decision to restrict data subjects’ rights, such as the right to be notified that their personal data is being processed, or limit obligations on the data controller, comes into effect only if and when the decision to apply an exemption is taken. In all cases, the use of the exemption must be both necessary and proportionate.
One of these exemptions, the immigration exemption, was recently amended in line with a court ruling that found it incompatible with the requirements set out in Article 23. This exemption is used by the Home Office. The purpose of Amendments 171 to 173 is to extend the protections applied to the immigration exemption across the other exemptions subject to Article 23, apart from in Schedule 4, where the requirement to consider whether applying the exemption would prejudice the relevant purposes is not considered applicable.
The other exemptions are each used in very different circumstances, by different data controllers—from government departments to SMEs—and work by applying different tests that function in a wholly different manner from the immigration exemption. This is important to bear in mind when considering these broad-brush amendments. A one-size-fits-all approach would not work across the exemption regime.
It is the Government’s position that any changes to these important exemptions should be made only after due consideration of the circumstances of that particular exemption. In many cases, these amendments seek to make changes that run counter to how the exemption functions. Making changes across the exemptions via this Bill, as the noble Lord’s amendments propose, has the potential to have significant negative impacts on the functioning of the exemptions regime. Any potential amendments to the other exemptions would require careful consideration. The Government note that there is a power to make changes to the exemptions in the DPA 2018, if deemed necessary.
For the reasons I have given, I look forward to hearing more from the noble Lord on his amendments, but I hope that he will not press them. I beg to move.
My Lords, I thank the Minister for that very careful exposition. I feel that we are heavily into wet towel, if not painkiller, territory here, because this is a tricky area. As the Minister might imagine, I will not respond to his exposition in detail, at this point; I need to run away and get some external advice on the impact of what he said. He is really suggesting that the Government prefer a pick ‘n’ mix approach to what he regards as a one size fits all. I can boil it down to that. He is saying that you cannot just apply the rules, in the sense that we are trying to reverse some of the impacts of the previous legislation. I will set out my stall; no doubt the Minister and I, the Box and others, will read Hansard and draw our own conclusions at the end, because this is a complicated area.
Until the end of 2023, the Data Protection Act 2018 had to be read compatibly with the UK GDPR. In a conflict between the two instruments, the provisions of the UK GDPR would prevail. The reversing of the relationship between the 2018 Act and the UK GDPR, through the operation of the Retained EU Law (Revocation and Reform) Act—REUL, as the Minister described it—has had the effect of lowering data protection rights in the UK. The case of the Open Rights Group and the3million v the Secretary of State for the Home Office and the Secretary of State for Digital, Culture, Media and Sport was decided after the UK had left the EU, but before the end of 2023. The Court of Appeal held that exemptions from data subject rights in an immigration context, as set out in the Data Protection Act, were overly broad, contained insufficient safeguards and were incompatible with the UK GDPR. The court disapplied the exemptions and ordered the Home Office to redraft them to include the required safeguards. We debated the regulations the other day, and many noble Lords welcomed them on the basis that they had been revised for the second time.
This sort of challenge is now not possible, because the relationship between the DPA and the UK GDPR has been turned on its head. If the case were brought now, the overly broad exemptions in the DPA would take precedence over the requirement for safeguards set out in the UK GDPR. These points were raised by me in the debate of 12 December, when the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 were under consideration. In that debate, the noble Baroness, Lady Swinburne, stated that
“we acknowledge the importance of making sure that data processing provisions in wider legislation continue to be read consistently with the data protection principles in the UK GDPR … Replication of the effect of UK GDPR supremacy is a significant decision, and we consider that the use of primary legislation is the more appropriate way to achieve these effects, such as under Clause 49 where the Government consider it appropriate”.—[Official Report, 12/12/23; col. GC 203.]
This debate on Clause 49 therefore offers an opportunity to reinstate the previous relationship between the UK GDPR and the Data Protection Act. The amendment restores the hierarchy, so that it guarantees the same rights to individuals as existed before the end of 2023, and avoids unforeseen consequences by resetting the relationship between the UK GDPR and the DPA 2018 to what the parliamentary draftsmen intended when the Act was written. The provisions in Clause 49, as currently drafted, address the relationship between domestic law and data protection legislation as a whole, but the relationship between the UK GDPR and the DPA is left in its “reversed” state. This is confirmed in the Explanatory Notes to the Bill at paragraph 503.
The purpose of these amendments is to restore data protection rights in the UK to what they were before the end of 2023, prior to the coming into force of REUL. The amendments would restore the fundamental right to the protection of personal data in UK law; ensure that the UK GDPR and the DPA continue to be interpreted in accordance with the fundamental right to the protection of personal data; ensure that there is certainty that assimilated case law that references the fundamental right to the protection of personal data still applies; and apply the protections required in Article 23 of the UK GDPR to all the relevant exemptions in Schedule 2 to the Data Protection Act. This is crucial in avoiding diminishing trust in our data protection frameworks. If people do not trust that their data is protected, they will refuse to share it. Without this data, new technologies cannot be developed, because these technologies rely on personal data. By creating uncertainty and diminishing standards, the Government are undermining the very growth in new technologies that they want.
My Lords, I have added my name to Amendment 195ZA—I will get to understand where these numbers come from, at some point—in the name of the noble Lord, Lord Kamall, who introduced it so eloquently. I will try to be brief in my support.
For many people, probably most, the use of online digital verification will be a real benefit. The Bill puts in place a framework to strengthen digital verification so, on the whole, I am supportive of what the Government are trying to do, although I think that the Minister should seriously consider the various amendments that the noble Baroness, Lady Jones of Whitchurch, has proposed to strengthen parliamentary scrutiny in this area.
However, not everyone will wish to use digital verification in all cases, perhaps because they are not sufficiently confident with technology or perhaps they simply do not trust it. We have already heard the debates around the advances of AI and computer-based decision-making. Digital identity verification could be seen to be another extension of this. There is a concern that Part 2 of the Bill appears to push people ever further towards decisions being taken by a computer.
I suspect that many of us will have done battle with some of the existing identity verification systems. In my own case, I can think of one bank where I gave up in deep frustration as it insisted on telling me that I was not the same person as my driving licence showed. I have also come up against systems used by estate agents when trying to provide a guarantee for my student son that was so intrusive that I, again, refused to use it.
Therefore, improving verification services is to be encouraged, but there must be some element of choice: if someone does not have the know-how, confidence or trust in the systems, they should be able to verify their identity through some non-digital alternative. They should not be barred from using important services such as, in my examples, banking and renting a property because they cannot, or would prefer not to, use a digital verification service.
At the very least, even if the Minister is not minded to accept that amendment, I hope that he can make clear that the Government have no intention to make digital ID verification mandatory, as some have suggested that this Part 2 may be driving towards.
My Lords, this is quite a disparate group of amendments. I support Amendment 195ZA, which I have signed. The noble Baroness, Lady Jones, and the noble Lords, Lord Kamall and Lord Vaux, have made clear the importance of having a provision such as this on the statute book. It is important that an individual can choose whether to use digital or non-digital means of verifying their identity: important for the liberty and equality of individuals, and for cultivating trust in what are, in essence, growing digital identity systems. The use of the word “empower” in these circumstances matters. We need to empower people rather than push them into digital systems that they may not be able to access. A move towards digitalisation is not a justification for compelling individuals to use systems that could compromise their privacy or their rights more broadly. I very much support that amendment on that basis.
I also very much support the amendments of the noble Baroness, Lady Jones, which I have signed. The Delegated Powers and Regulatory Reform Committee could not have made its recommendations clearer. The Government are serial offenders in terms of skeleton Bills. We have known that from remarks made by the noble Lord, Lord Hodgson, on the Government Benches over a long period. I am going to be extremely interested in what the Government have to say. Quite often, to give them some credit, they listen to what the DPRRC has to say and I hope that on this occasion the Minister is going to give us some good news.
This is an extremely important new system being set up by the Government. We have been waiting for the enabling legislation for quite some time. It is pretty disappointing, after all the consultations that have taken place, just how skeletal it is. No underlying principles have been set out. There is a perfectly good set of principles set out by the independent Privacy and Consumer Advisory Group that advises the Government on how to provide a simple, trusted and secure means of accessing public services. But what assurance do we have that we are going to see those principles embedded in this new system?
Throughout, it is vital that the Secretary of State be obliged to address the kinds of concerns being raised as this DVS trust framework is developed, so as to ensure that those services protect the people who use them. We need that kind of parliamentary debate; it has been made quite clear that nothing less will do. I therefore very much support what the noble Baroness, Lady Jones, had to say on that subject.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Jones, and my noble friend Lord Kamall for their amendments. To address the elephant in the room first, I can reassure noble Lords that the use of digital identity will not be mandatory, and privacy will remain one of the guiding principles of the Government’s approach to digital identity. There are no plans to introduce a centralised, compulsory digital ID system for public services, and the Government’s position on physical ID cards remains unchanged. The Government are committed to realising the benefits of digital identity technologies without creating ID cards.
I shall speak now to Amendment 177, which would require the rules of the DVS trust framework to be set out in regulations subject to the affirmative resolution procedure. I recognise that this amendment, and others in this group, reflect recommendations from the DPRRC. Obviously, we take that committee very seriously, and we will respond to that report in due course, but ahead of Report.
Part 2 of the Bill will underpin the DVS trust framework, a document of auditable rules, which include technical standards. The trust framework refers to data protection legislation and ICO guidance. It has undergone four years of development, consultation and testing within the digital identity market. Organisations can choose to have their services certified against the trust framework to prove that they provide secure and trustworthy digital verification services. Certification is provided by independent conformity assessment bodies that have been accredited by the UK Accreditation Service. Annual reviews of the trust framework are subject to consultation with the ICO and other appropriate persons.
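The chain of assurance described here runs from the UK Accreditation Service, through accredited conformity assessment bodies, down to individual certified services. What follows is a minimal sketch, with invented names and simplified logic throughout; it illustrates the structure only, and is not the trust framework’s actual rules.

```python
# Illustrative sketch only: invented names; not the trust framework's rules.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VerificationService:
    provider: str
    certified_by: Optional[str] = None  # set once a CAB certifies the service

@dataclass
class ConformityAssessmentBody:
    name: str
    ukas_accredited: bool  # accreditation by the UK Accreditation Service

    def certify(self, service: VerificationService) -> bool:
        # A CAB may certify a digital verification service against the
        # trust framework rules only if it is itself accredited.
        if not self.ukas_accredited:
            return False
        service.certified_by = self.name
        return True

cab = ConformityAssessmentBody("ExampleCAB", ukas_accredited=True)
dvs = VerificationService("ExampleID Ltd")
print(cab.certify(dvs), dvs.certified_by)  # True ExampleCAB
```

Note that, as in the Bill, certification in this sketch is voluntary: a provider that never seeks certification simply remains outside the scheme.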
Requiring the trust framework to be set out in regulations would make it hard to introduce reactive changes. For example, if a new cybersecurity threat emerged which required the rapid deployment of a fix across the industry, the trust framework would need to be updated very quickly. Developments in this fast-growing industry require an agile approach to standards and rule-making. We cannot risk the document becoming outdated and losing credibility with industry. For these reasons, the Government feel that it is more appropriate for the Secretary of State to have the power to set the rules of the trust framework with appropriate consultation, rather than for the power to be exercised by regulations.
I turn to Amendments 178 to 195, which would require the fees that may be charged under this part of the Bill to be set out in regulations subject to the negative resolution procedure. The Government have committed to growing a market of secure and inclusive digital identities as an alternative to physical proofs of identity, for those who choose to use them. Fees will be introduced only once we are confident that doing so will not restrict the growth of this market, but the fee structure, when introduced, is likely to be complex and will need to flex to support growth in an evolving market.
There are built-in safeguards to this fee-charging power. First, there is a strong incentive for the Secretary of State to set fees that are competitive, fair and reasonable, because failing to do so would prevent the Government realising their commitment to grow this market. Secondly, these fee-raising powers have a well-defined purpose and limited scope. Thirdly, the Secretary of State will explain in advance what fees she intends to charge and when she intends to charge them, which will ensure the appropriate level of transparency.
The noble Baroness, Lady Jones, asked about the arrangements for the office for digital identities and attributes. It will not initially be independent, as it will be located within the Department for Science, Innovation and Technology. As we announced in the government response to our 2021 consultation, we intend this to be an interim arrangement until a suitable long-term home for the governing body can be identified. Delegating the role of Ofdia—as I suppose we will call it—to a third party in the future is subject to parliamentary scrutiny, as provided for by the clauses in the Bill. Initially placing Ofdia inside government will ensure that its oversight role can mature in the most effective way and that it supports the digital identity market in meeting the needs of individual users, relying parties and industry.
Digital verification services are independently certified against the trust framework rules by conformity assessment bodies. Conformity assessment bodies are themselves independently accredited by the UK Accreditation Service to ensure that they have the competence and impartiality to perform certification. The trust framework certification scheme will be accredited by the UK Accreditation Service to give confidence that the scheme can be efficiently and competently used to certify products, processes and services. All schemes will need to meet internationally agreed standards set out by the UK Accreditation Service. Ofdia, as the owner of the main code, will work with UKAS to ensure that schemes are robust, capable of certification and operated in line with the trust framework.
Amendment 184A proposes to exclude certified public bodies from registering to provide digital verification services. The term “public bodies” could include a wide range of public sector entities, including institutions such as universities, that receive any public funding. The Government take the view that this exclusion would be unnecessarily restrictive in the UK’s nascent digital identity market.
Amendment 195ZA seeks to mandate organisations to implement a non-digital form of verification in every instance where a digital method is required. The Bill enables the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them, nor does it insist that businesses which currently accept non-digital methods of verification must transition to digital methods. As Clause 52 makes clear, digital verification services are services that are provided at the request of the individual. The purpose of the Bill is to ensure that, when people want to use a digital verification service, they know which of the available products and services they can trust.
Some organisations operate only in the digital sphere, such as online-only banks and energy companies. To oblige such organisations to offer manual document checking would place obligations on them that would go beyond the Government’s commitment to do only what is necessary to enable the digital identity market to grow. In so far as this amendment would apply to public authorities, the Equality Act requires those organisations to consider how their services will affect people with protected characteristics, including those who, for various reasons, might not be able or might choose not to use a digital identity product.
Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?
I understand that some services are purely digital, but some people may well not have a digital ID. We do not know what future services there might be, so such people might want to show an analogue ID. Is my noble friend saying that that will not be possible because it would impose too much of a burden on those innovative digital companies? Could he clarify what he said?
On this point, the argument that the Government are making is that, where consumers want to use a digital verification service, all the Bill does is to provide a mechanism for those DVSs to be certified and assured to be safe. It does not seek to require anything beyond that, other than creating a list of safe DVSs.
The Equality Act applies to the public sector space, where it needs to be followed to ensure that there is an absolute right to inclusive access to digital technologies.
My Lords, in essence, the Minister is admitting that there is a gap when somebody who does not have access to digital services needs an identity to deal with the private sector. Is that right?
In the example I gave, I was not willing to use a digital system to provide a guarantee for my son’s accommodation in the private sector. I understand that that situation would not be protected and that, therefore, someone might not be able to rent a flat, for example, because they can provide only physical ID.
The Bill does not change the requirements in this sense. If any organisation chooses to provide its services on a digital basis only, that is up to that organisation, and it is up to consumers whether they choose to use it. It makes no changes to the requirements in that space.
I will now speak to the amendment that seeks to remove Clause 80. Clause 80 enables the Secretary of State to ask accredited conformity assessment bodies and registered DVS providers to provide information which is reasonably required to carry out her functions under Part 2 of the Bill. The Bill sets out a clear process that the Secretary of State must follow when requesting this information, as well as explicit safeguards for her use of the power. These safeguards will ensure that DVS providers and conformity assessment bodies have to provide only information necessary for the functioning of this part of the Bill.
My Lords, the clause stand part amendment was clearly probing. Does the Minister have anything to say about the relationship with OneLogin? Is he saying that it is only information about systems, not individuals, which does not feed into the OneLogin identity system that the Government are setting up?
It is very important that the OneLogin system is entirely separate and not considered a DVS. We considered whether it should be, but the view was that that comes close to mandating a digital identity system, which we absolutely want to avoid. Hence the two are treated entirely differently.
That is a good reassurance, but if the Minister wants to unpack that further by correspondence, I would be very happy to have that.
I am very happy to do so.
I turn finally to Amendments 289 and 300, which aim to introduce a criminal offence of digital identity theft. The Government are committed to tackling fraud and are confident that criminal offences already exist to cover the behaviour targeted by these amendments. Under the Fraud Act 2006, it is a criminal offence to make a gain from the use of another person’s identity or to cause or risk a loss by such use. Where accounts or databases are hacked into, the Computer Misuse Act 1990 criminalises unauthorised access to a computer program or to data held on a computer.
Furthermore, the trust framework contains rules, standards and good practice requirements for fraud monitoring and responding to fraud. These rules will further defend systems and reduce opportunities for digital identity theft.
My Lords, I am sorry, but this is a broad-ranging set of amendments, so I need to intervene on this one as well. When the Minister writes the letter he has promised in response to today’s proceedings, could he tell us what guidance there is to the police on this? When the individual, Mr Arron, approached the police, they said, “Oh, sorry, there’s nothing we can do; identity theft is not a criminal offence”. The Minister seems to be saying, “No, it is fine; it is all encompassed within these provisions”. While he may be saying that, and I am sure he will be shouting it from the rooftops in future, the question is whether the police have guidance; does the College of Policing have guidance and does the Home Office have guidance? The ordinary individual needs to know that it is exactly as the Minister says and that identity theft is covered by these other criminal offences. There is no point in having those offences if nobody knows about them.
That is absolutely fair enough: I will of course write. Sadly, we are not joined today by ministerial colleagues from the Home Office, who have some other Bill going on.
I have no doubt that its contribution to the letter will be equally enjoyable. However, for all the reasons I have set out, I am not able to accept these amendments and respectfully encourage the noble Baroness and noble Lords not to press them.
My Lords, this group has three amendments within it and, as the noble Lord, Lord Vaux, said, it is a disparate group. The first two seem wholly benign and entirely laudable, in that they seek to ensure that information about the environmental impacts of data-related business activity is shared and provided. The noble Baroness, Lady Bennett, said hers was a small and modest amendment: I agree entirely with that, but it is valuable nevertheless.
If I had to choose which amendment I prefer, it would be the second, in the name of my noble friend Lady Young, simply because it is more comprehensive and seems to be of practical value in pursuing policy objectives related to climate change mitigation. I cannot see why the disclosure of an impact analysis of current and future announcements, including legislation, changes in targets and large contracts, on UK climate change mitigation targets would be a problem. I thought my noble friend was very persuasive and her arguments about impact assessment were sound. The example of offshore petroleum legislation effectively not having an environmental impact assessment when its impacts are pretty clear was a very good one indeed. I am one of those who believes that environmental good practice should be written all the way through, a bit like a stick of Brighton rock, and I think that about legislation. It is important that we take on board that climate change is the most pressing issue that we face for the future.
The third amendment, in the name of my noble friend Lady Jones, is of a rather different nature, but is no less important, as it relates to the UK’s data adequacy and the EU’s decisions on it. We are grateful to the noble Lords, Lord Vaux of Harrowden and Lord Clement-Jones, for their support. Put simply, it would oblige the Secretary of State to complete an assessment, within six months of the Bill’s passing,
“of the likely impact of the Act on the EU’s data adequacy decisions relating to the UK”.
It would oblige the Secretary of State to lay a report on the assessment’s findings, and the report must cover data risk assessments and the impact on SMEs. It must also include an estimate of the legislation’s financial impact. The noble Lord, Lord Vaux, usefully underlined the importance of this, with its critical 2025 date. The amendment also probes
“whether the Government anticipate the provisions of the Bill conflicting with the requirements that need to be made by the UK to maintain a data adequacy decision by the EU”.
There is widespread and considerable concern about data adequacy and whether the UK legislative framework diverges too far from the standards that apply under the EU GDPR. The risk that the UK runs in attempting to reduce compliance costs for the free flow of personal data is that safeguards are removed to the point where businesses and trade become excessively concerned. In summary, many sectors including manufacturing, retail, health, information technology and particularly financial services are concerned that the free flow of data between us and the EU, with minimal disruption, will simply not be able to continue.
As the noble Lord, Lord Vaux, underlined, it is important that we in the UK have a relationship of trust with the European Commission on this, although ultimately data adequacy could be tested in the Court of Justice of the European Union. Data subjects in the EU can rely on the general principle of the protection of personal data to invalidate EU secondary and domestic law conflicting with that principle. Data subjects can also rely on the Charter of Fundamental Rights to bring challenges. Both these routes were closed off when the UK left the EU and the provisions were not saved in UK law, so it can be argued that data protection rights are already at a lower standard than across the European Union.
It is worth acknowledging that adequacy does not necessarily require equivalence. We can have different, and potentially lower, standards than the EU but, as long as those protections are deemed to meet whatever criteria the Commission chooses to apply, it is all to the good.
However, while divergence is possible, the concern that we and others have is that the Bill continues chipping away at standards in too many different ways. This chipping away is also taking place in statutory instruments, changes to guidance and so on. If His Majesty’s Government are satisfied that the overall picture remains that UK regulation is adequate, that is welcome, but it would be useful to know what mechanism DSIT and the Government more generally intend to use to measure where the tipping point might lie and how close these reforms take us to it.
The Committee will need considerable reassurance on the question of data adequacy, not least because of its impact on businesses and financial services in the longer term. At various times, the Minister has made the argument that a Brexit benefit is contained within this legislation. If he is ultimately confident of that case, what would be the impact on UK businesses if that assessment is wrong in relation to data adequacy decisions taken within the EU?
We are going to need more than warm words and a recitation that “We think it’s right and that we’re in the right place on data adequacy”. We are going to need some convincing. Whatever the Minister says today, we will have to return to this issue on Report. It is that important for businesses in this country and for the protection of data subjects.
My Lords, these amendments have been spoken to so well that I do not need to spend a huge amount of time repeating those great arguments. Both Amendment 195A, put forward by the noble Baroness, Lady Bennett, and Amendment 218 have considerable merit. I do not think that they conflict; they are complementary, in many respects.
Awareness raising is important here, especially in relation to Amendment 218. For instance, if regulators are going to have a growth duty, which looks likely, why not have countervailing duties relating to climate change, as the noble Baroness, Lady Young, proposed so cogently in Amendment 218? Amendment 195A likewise helps to raise awareness in the private sector, among traders and so on. Both amendments deserve support.
Before the Minister stands up, let me just say that I absolutely agree with what the noble Lord, Lord Bassam, said. Have the Government taken any independent advice? It is easy to get wrapped up in your own bubble. The Government seem incredibly blithe about this Bill. You only have to have gone through our days in this Committee to see the fundamental changes that are being made to data protection law, yet the Government, in this bubble, seem to think that everything is fine despite the warnings coming from Brussels. Are they taking expert advice from outside? Do they have any groups of academics, for instance, who know about this kind of thing? It is pretty worrying. The great benefit of this kind of amendment, put forward by the noble Baroness, Lady Jones, is that nothing would happen until we were sure that we were going to be data adequate. That seems a fantastic safeguard to me. If the Government are just flying blind on this, we are all in trouble, are we not?
My Lords, may I point out that, when it comes to the interests of the EU, this does not go just one way? There is a question around investment as well. For example, any large bank currently running a data-processing facility in this country that covers the whole of Europe may decide, if we lose data adequacy, to move it to Europe. Anyone considering setting up such a facility would probably choose Europe rather than here. There is therefore an investment draw towards the EU.
Yes. I would be happy to provide a list of the people we have spoken to about adequacy; it may be a long one. That concludes the remarks I wanted to make, I think.
Perhaps the Minister could just tweak that a bit by listing not just the people who have made positive noises but those who have their doubts.
I do not have the benefit of seeing a Hansard update to know after which word I was interrupted and we had to leave to vote, so I will just repeat, I hope not unduly, the main point I was making at the time of the Division. This was that the central conclusion of the CRISP report is that the Government’s policy
“generates significant gaps in the formal oversight of biometrics and surveillance practices in addition to erasing many positive developments aimed at raising standards and constructive engagement with technology developers, surveillance users and the public”.
The reason I am very glad to support the noble Lord, Lord Holmes, in these amendments is that, given the complexities of the current regulatory landscape and the protections offered by the BSCC in an era of increasingly intensive, advanced and intrusive surveillance, the abolition of the BSCC leaves these oversight gaps open while creating additional regulatory complexity. I will be interested to see how the Minister defends the claim that this abolition will improve the situation.
I do not want to detain the Committee for very long, but I shall just read this one passage from the report into the record, because it is relevant to the debate we are having. We should not remove
“a mechanism for assuring Parliament and the public of appropriate surveillance use, affecting public trust and legitimacy at a critical moment concerning public trust in institutions, particularly law enforcement. As drafted, the Bill reduces public visibility and accountability of related police activities. The lack of independent oversight becomes amplified by other sections of the Bill that reduce the independence of the current Information Commissioner role”.
In short, I think it would be a mistake to abolish the biometrics commissioner, and on that basis, I support these amendments.
My Lords, it has been a pleasure to listen to noble Lords’ speeches in this debate. We are all very much on the same page and have very much the same considerations in mind. Everyone has mentioned both the protection of biometric data itself and the means by which we regulate and oversee its use. We may have slightly different paths to securing that protection and oversight, but we all have the same intentions.
The noble Lord, Lord Holmes, pointed to the considerable attractions of, in a sense, starting afresh, but I have chosen a rather different path. I think it was the noble Lord, Lord Vaux, who mentioned Fraser Sampson, the former Biometrics and Surveillance Camera Commissioner. I must admit that I have very high regard for the work he did, and also for the work of such people as Professor Peter Fussey of Essex University. Of course, a number of noble Lords have mentioned the work of CRISP in all this, which kept us very well briefed on the consequence of these clauses.
No one has yet spoken to the stand part notices on Clauses 130 to 132; I will come on to those on Clauses 147 to 149 shortly. The Bill would drastically change the way UK law enforcement agencies can handle biometric personal data. Clauses 130 to 132 would allow for data received from overseas law enforcement agencies to be stored in a pseudonymised, traceable format indefinitely.
For instance, Clause 130 would allow UK law enforcement agencies to hold biometric data received from overseas law enforcement agencies in a pseudonymised format. In cases where the authority ceases to hold the material pseudonymously and the individual has no previous convictions or only one exempt conviction, the data may be retained in a non-pseudonymous format for up to three years. Therefore, the general rule is indefinite retention with continuous pseudonymisation, except for a specific circumstance where non-pseudonymised retention is permitted for a fixed period. I forgive noble Lords if they have to read Hansard to make total sense of that.
This is a major change in the way personal data can be handled. Permitting storage of pseudonymised or non-pseudonymised data will facilitate a vast biometric database that can be traced back to individuals. Although this does not apply to data linked to offences committed in the UK, it sets a concerning precedent for reshaping how law enforcement agencies hold data in a traceable and identifiable way. It seems that there is nothing to stop a law enforcement agency pseudonymising data just to reattach the identifying information, which they would be permitted to hold for three years.
The clauses do not explicitly define the steps that must be taken to achieve pseudonymisation. This leaves a broad scope for interpretation and variation in practice. The only requirement is that the data be pseudonymised
“as soon as reasonably practicable”,
which is a totally subjective threshold. The collective impact of these clauses, which were a late addition to the Bill on Report in the Commons, is deeply concerning. We believe that these powers should be withdrawn to prevent a dangerous precedent being set for police retention of vast amounts of traceable biometric data.
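The traceability point is easy to demonstrate in practice. Below is a minimal sketch, with an invented key and an invented record, of the kind of keyed pseudonymisation the clauses appear to contemplate; it is illustrative only, not the Bill’s definition, and the point to note is that nothing in the technique itself stops the holder of the key, or of a mapping table, from re-attaching identities.

```python
# Illustrative sketch: invented key and data; not the Bill's definition
# of pseudonymisation.
import hashlib
import hmac

SECRET_KEY = b"held-by-the-agency"  # hypothetical pseudonymisation key

def pseudonymise(identifier: str) -> str:
    # Replace a direct identifier with a stable keyed token; the same
    # input always yields the same token, so records remain linkable.
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "biometric_template": "<template bytes>"}
token = pseudonymise(record.pop("name"))
record["subject_token"] = token  # the record is now "pseudonymised"

# An agency that also retains a mapping table can re-identify at will.
mapping = {token: "Jane Doe"}
print(mapping[record["subject_token"]])  # Jane Doe
```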
The stand part notices on Clauses 147 to 149 have been spoken to extremely cogently by the noble Lord, Lord Vaux, the noble Viscount, Lord Stansgate, and the noble Baroness, Lady Harding. I will not repeat a great deal of what they said but what the noble Baroness, Lady Harding, said about the Human Fertilisation and Embryology Authority really struck a chord with me. When we had our Select Committee on Artificial Intelligence, we looked at models for regulation and how to gain public trust for new technologies and concepts. The report that Baroness Warnock did into fertilisation and embryology was an absolute classic and an example of how to gain public trust. As the noble Baroness, Lady Harding, said, it has stood the test of time. As far as I am concerned, gaining that kind of trust is the goal for all of us.
What we are doing here risks precisely the reverse by abolishing the office of the Biometrics and Surveillance Camera Commissioner. This was set up under the Protection of Freedoms Act 2012, which required a surveillance camera commissioner to be appointed and a surveillance camera code of practice to be published. Other functions of the Biometrics and Surveillance Camera Commissioner are in essence both judicial and non-judicial. They include developing and encouraging compliance with the surveillance camera code of practice; raising standards for surveillance camera developers, suppliers and users; public engagement; building legitimacy; reporting annually to Parliament via the Home Secretary; convening expertise to support these functions; and reviewing all national security determinations and other powers by which the police can retain biometric data. The Bill proposes to erase all but one—I stress that—of these activities.
The noble Lord, Lord Vaux, quoted CRISP. I will not repeat the quotes he gave but its report, which the noble Viscount, Lord Stansgate, also cited, warns that
“plans to abolish and not replace existing safeguards in this crucial area will leave the UK without proper oversight just when advances in artificial intelligence (AI) and other technologies mean they are needed more than ever”.
The Bill’s reduction of surveillance-related considerations to data protection compares unfavourably to regulatory approaches in other jurisdictions. Many have started from data protection and extended it to cover the wider rights-based implications of surveillance. Here, the Bill proposes a move in precisely the opposite direction. I am afraid this is yet another example of the Bill going entirely in the wrong direction.
My Lords, I thank all noble Lords who have contributed to what has been an excellent debate on this issue. We have all been united in raising our concerns about whether the offices of the biometrics commissioner and the surveillance camera commissioner should be abolished. We all feel the need for more independent oversight, not less, as is being proposed here.
As we know, the original plan was for the work of the biometrics commissioner to be transferred to the Information Commissioner, but when he raised concerns that this would result in the work receiving less attention, it was decided to transfer it to the Investigatory Powers Commissioner instead. Meanwhile, the office of the surveillance camera commissioner is abolished on the basis that these responsibilities are already covered elsewhere. However, like other noble Lords, we remain concerned that the transfer of this increasingly important work from both commissioners will mean that it does not retain the same level of expertise and resources as it enjoys under the current regime.
These changes have caused some alarm among civic society groups such as the Ada Lovelace Institute and the Centre for Research into Information Surveillance and Privacy, to which noble Lords have referred. They argue that we are experiencing a huge expansion in the reach of surveillance and biometric technology. The data being captured, whether faces, fingerprints, walking style, voice or the shape of the human body, are uniquely personal and part of our individual identity. Such data can enhance public safety, but it also raises critical ethical concerns around privacy, free expression, bias and discrimination. As the noble Lord, Lord Vaux, said, we need a careful balance between protection and privacy.
The noble Baroness, Lady Harding, quite rightly said that there is increasing public mistrust in the use of these techniques, and that is why there is an urgent need to take people on the journey. The example the noble Baroness gave was vivid. We need a robust legal framework to underpin the use of these techniques, whether it is by the police, the wider public sector or private institutions. As it stands, the changes in the Bill do not achieve that reassurance, and we have a lot of lessons to learn.
Rather than strengthening the current powers to respond to the huge growth and reach of surveillance techniques, the Bill essentially waters down the protections. Transferring the powers from the BSCC to the new Information Commissioner reduces the issue to data protection, when the issues of intrusion and the misuse of biometrics and surveillance are much wider than that. Meanwhile, the impact of AI will herald a growth of new techniques such as facial emotional appraisal and video manipulation, leading to such things as deepfakes. All these techniques threaten to undermine our sense of self and our control of our own personal privacy.
The amendment in the name of the noble Lord, Lord Holmes, takes up the suggestion, also made by the Ada Lovelace Institute, to establish a biometrics office within the ICO, overseen by three experienced commissioners. Its functions would include general oversight of biometric techniques, keeping a register of biometric users and establishing a process for considering complaints. Importantly, it would require all entities processing biometric data to register with the ICO prior to any use.
We believe that these amendments are a really helpful contribution to the discussion. They would place the oversight of biometric techniques in a more effective setting, where the full impacts of these techniques can be properly monitored, measured and reported on. We would need more details of the types of work to be undertaken by these commissioners, and of the cost implications, but in principle we support these amendments because they seem to answer our concerns. We thank the noble Lord for tabling them and very much hope the Minister will give the proposals serious consideration.
I apologise if I have misunderstood. It sounds like it would be a unit within the ICO responsible for that matter. Let me take that away if I have misunderstood—I understood it to be a separate organisation altogether.
The Government deem Amendment 238 unnecessary, as using biometric data to categorise or make inferences about people, whether using algorithms or otherwise, is already subject to the general data protection principles and the high data protection standards of the UK’s data protection framework as personal data. In line with ICO guidance, where the processing of biometric data is intended to make an inference linked to one of the special categories of data—for example, race or ethnic origin—or the biometric data is processed for the intention of treating someone differently on the basis of inferred information linked to one of the special categories of data, organisations should treat this as special category data. These protections ensure that this data, which is not used for identification purposes, is sufficiently protected.
Similarly, Amendment 286 intends to widen the scope of the Forensic Information Databases Service—FINDS—strategy board beyond oversight of biometrics databases for the purpose of identification to include “classification” purposes as well. The FINDS strategy board currently provides oversight of the national DNA database and the national fingerprint database. The Bill puts oversight of the fingerprint database on the same statutory footing as that of the DNA database and provides the flexibility to add oversight of new biometric databases, where appropriate, to provide more consistent oversight in future. The delegated power could be used in the medium term to expand the scope of the board to include a national custody image database, but no decisions have yet been taken. Of course, this will be kept under review, and other biometric databases could be added to the board’s remit in future should these be created and should this be appropriate. For the reasons I have set out, I hope that the noble Baroness, Lady Jones of Whitchurch, will therefore agree not to move Amendments 238 and 286.
Responses to the data reform public consultation in 2021 supported the simplification of the complex oversight framework for police use of biometrics and surveillance cameras. Clauses 147 and 148 of the Bill reflect that by abolishing the Biometrics and Surveillance Camera Commissioner’s roles while transferring the commissioner’s casework functions to the Investigatory Powers Commissioner’s Office.
Noble Lords referred to the CRISP report, which was commissioned by Fraser Sampson—the previous commissioner—and directly contradicts the outcome of the public consultation on data reform in 2021, including on the simplification of the oversight of biometrics and surveillance cameras. The Government took account of all the responses, including from the former commissioner, in developing the policies set out in the DPDI Bill.
There will not be a gap in the oversight of surveillance as it will remain within the statutory regulatory remit of other organisations, such as the Information Commissioner’s Office, the Equality and Human Rights Commission, the Forensic Science Regulator and the Forensic Information Databases Service strategy board.
One of the crucial aspects has been the reporting of the Biometrics and Surveillance Camera Commissioner. Where is there going to be, and who is going to have, a comprehensive report relating to the use of surveillance cameras and the biometric data contained within them? Why have the Government decided that they are going to separate out the oversight of biometrics from, in essence, the surveillance aspects? Are not the two irretrievably brought together by things such as live facial recognition?
Yes. There are indeed a number of different elements of surveillance camera oversight; those are reflected in the range of different bodies doing that work. As to the mechanics of the production of the report, I am afraid that I do not know the answer.
Does the Minister accept that the police are one of the key agencies that will be using surveillance cameras? He now seems to be saying, “No, it’s fine. We don’t have one single oversight body; we had four at the last count”. He probably has more to say on this subject but is that not highly confusing for the police when they have so many different bodies that they need to look at in terms of oversight? Is it any wonder that people think the Bill is watering down the oversight of surveillance camera use?
No. I was saying that there was extensive consultation, including with the police, and that that has resulted in these new arrangements. As to the actual mechanics of the production of an overall report, I am afraid that I do not know but I will find out and advise noble Lords.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services also inspects, monitors and reports on the efficiency and effectiveness of the police, including their use of surveillance cameras. All of these bodies have statutory powers to take the necessary action when required. The ICO will continue to regulate all organisations’ use of these technologies, including being able to take action against those not complying with data protection law, and a wide range of other bodies will continue to operate in this space.
On the first point made by the noble Lord, Lord Vaux, where any of the privacy concerns he raises concern information that relates to an identified or identifiable living individual, I can assure him that this information is covered by the UK’s data protection regime. This also includes another issue raised by the noble Lord—where the ANPR captures a number-plate that can be linked to an identifiable living individual—as this would be the processing of personal data and thus governed by the UK’s data protection regime and regulated by the ICO.
For the reasons I have set out, I maintain that these clauses should stand part of the Bill. I therefore hope that the noble Lord, Lord Clement-Jones, will withdraw his stand part notices on Clauses 147 and 148.
Clause 149 does not affect the office of the Biometrics and Surveillance Camera Commissioner, which the noble Lord seeks to maintain through his amendment. The clause’s purpose is to update the name of the national DNA database board and update its scope to include the national fingerprint database within its remit. It will allow the board to produce codes of practice and introduce a new delegated power to add or remove biometric databases from its remit in future via the affirmative procedure. I therefore maintain that this clause should stand part of the Bill and hope that the noble Lord will withdraw his stand part notice.
Clauses 147 and 148 will improve consistency in the guidance and oversight of biometrics and surveillance cameras by simplifying the framework. This follows public consultation, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication. The Government feel that a review, as proposed, so quickly after the Bill is enacted is unnecessary. It is for these reasons that I cannot accept Amendment 292 in the name of the noble Lord, Lord Clement-Jones.
I turn now to the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove Clauses 130 to 132. These clauses make changes to the Counter-Terrorism Act 2008, which provides the retention regime for biometric data held on national security grounds. The changes have been made only following a formal request from Counter Terrorism Policing to the Home Office. The exploitation of biometric material, including from international partners, is a valuable tool in maintaining the UK’s national security, particularly for ensuring that there is effective tripwire coverage at the UK border. For example, where a foreign national applies for a visa to enter the UK, or enters the UK via a small boat, their biometrics can be checked against Counter Terrorism Policing’s holdings and appropriate action to mitigate risk can be taken, if needed.
Data Protection and Digital Information Bill Debate
Lord Clement-Jones (Liberal Democrat - Life peer), Department for Work and Pensions
(8 months ago)
Grand Committee
My Lords, I do not know how unusual this is, but we are on the same page across both sides of the Committee.
First, having signed the amendments by the noble Lord, Lord Lucas, I express my support for the first batch, Amendments 199 to 201, which are strongly supported by the Advertising Association and the Interactive Advertising Bureau for obvious reasons. The noble Lords, Lord Lucas and Lord Bassam, and the noble Viscount, Lord Chandos, have expressed why they are fundamental to advertising on the internet. Audience measurement is an important function, for media owners in particular, to determine the consumption of content and to price advertising space for advertisers.
I understand that the department, DSIT, has conceded that most of the use cases for audience measurement fit within the term “statistical purposes”. It is this area of performance that is so important. As the noble Lord, Lord Bassam, seemed to indicate, we may be within touching distance of agreement on that, but the Minister needs to be explicit about it so that the industry understands what the intent behind that clause really is. As a number of noble Lords have said, this is a specific and targeted exemption for audience measurement and performance cookies, limiting the exemption from consent to those purposes, and, as such, it should definitely be supported. I very much hope that, if the Minister cannot give the necessary assurance now, then, as a number of noble Lords have said, he will engage in further discussions.
Amendments 203, which I have signed, and 205 are extremely important too. Amendment 203, picked up clearly by the noble Lord, Lord Bassam, is potentially important; it could save an awful lot of aggravation for users on the internet. It is potentially game-changing given that, when we approach the same site—even Google—we have to keep clicking on the cookie banner. I very much hope the Minister will see the sense in that because, if we are changing the EC regulations, we need to do something sensible and useful like that. It might even give the Bill a good name.
As all noble Lords have rightly said, the Secretary of State needs to think about the implementation of the regulations and what they will affect. Amendment 202 is fundamental and badly needed. You need only look at the list of those who are absolutely concerned about the centralisation of cookie consent: the Internet Advertising Bureau, the Advertising Association, the Data & Marketing Association, the Market Research Society, the News Media Association, the Incorporated Society of British Advertisers, the Association of Online Publishers and the Professional Publishers Association. I hope that the Government are in listening mode and will listen to their concerns.
As the PPA says, centralising cookie consent with browsers could cause consumers far more harm than good. The Secretary of State’s powers would override cookie consent relationships between individuals and specialist publishers, which the noble Lord, Lord Bassam, talked about in particular. In all likelihood, a significant number of internet users would not consent to cookies from the browser but would consent to cookies on the websites of publishers that they know and trust. If the Secretary of State were to use this power to enforce cookie centralisation, many publishing businesses would be forced to present consumers with paywalls in order to remain financially sustainable. That would leave consumers without the opportunity to access high-quality publishing content without having to pay a fee.
The PPA has made an extremely good case. This would amplify existing barriers to competition in the digital market. There are provisions in the DMCC Bill that would give powers to the CMA to address any problems, such as enforced data sharing from platforms to publishers, but centralising cookie consent would completely undermine the objectives of that legislation. It is clear that this Bill should be amended to withdraw the provisions giving the Secretary of State the power to introduce these centralised cookie controls. I very much hope that the Minister will have second thoughts, given the weight of opinion and the impact that the Secretary of State’s powers would have.
My Lords, if the Committee will indulge me, I was a little late arriving for the introduction to this group of amendments by my noble friend Lord Lucas, but I heard most of what he said and I will speak briefly. I am quite sympathetic to the arguments about the exemption being too tightly drawn and the advantage that this is likely to give the likes of Google and Meta in the advertising ecology. As the noble Lord, Lord Clement-Jones, said, a range of different trade bodies have raised concerns about this, certainly with me.
From my perspective, the other point of interest that I want to flag is that the Communications and Digital Committee is currently doing an inquiry into the future of news. As part of the evidence that we have taken in that inquiry, one of our witnesses from the news industry raised their concerns about a lack of joined-up thinking, as they described it, within government when it comes to various bits of legislation containing measures that are inadvertently detrimental to the news or publishing industry. That happens because there has been no proper understanding or recognition of how interconnected the digital news environment now is. Something like this, on cookies, could have quite a profound effect on the news and publishing industry, which we know is reliant on advertising and is increasingly feeling the pinch because the value that it gets from digital advertising is being squeezed all the time. I just wanted to reinforce the point, for the benefit of my noble friend the Minister, that concern about this is widespread and real.
I am interested in the Minister’s point about the flexibility the Government see in this clause, but I am not sure who in the end has the responsibility to lead on that flexibility. Will it come from the commissioner or be driven by the Secretary of State’s considerations? The consultation duties seem very dependent on the commissioner’s view and I am not sure at what stage the Secretary of State would want to intervene to ensure that they have got this bit right. That is very important, because the balance is quite sophisticated.
The Minister used the expression “when the evidence emerges”, as did the noble Viscount, Lord Camrose, in another context last week. I would have thought that these organisations know what they are about, and they have provided some pretty comprehensive evidence about the impact on their businesses. Is that not a pretty good reason for the Government to think that they might not have this set of provisions entirely right, quite apart from the other aspects of this group of amendments? If that evidence is not enough—I read out the list of organisations—the Government are more or less saying that they will not accept any evidence.
I thank both noble Lords for their interventions. On the point from the noble Lord, Lord Bassam, there is a trifecta of decision-making between the Secretary of State, the ICO and the organisations all working together. That is why there is a consultation requirement before using the power. On the point from the noble Lord, Lord Clement-Jones, it is a question of your point of view; we feel that we have done stakeholder engagement and believe that we have got the balance right between the needs of organisations—
Will the Minister write and unpack exactly what the balance of opinion was? We are talking about pretty crucial stuff here. It is not always a question just of numbers; it is quite often a question of weighting the arguments. The Minister should write to us and tell us how they came to that conclusion, because the case was clearly being made during the consultation, but the Government have effectively ignored it.
In this tripartite geography that the noble Lord described, the power—
I am not a gambling man. It is an interesting term. The Minister is suggesting that power rests equally among those three elements but it does not. The Secretary of State is the all-powerful being and the commissioner is there to ensure that regulation works effectively. How will this operate in practice? There is no advisory body here; it is the Secretary of State having a discussion with the commissioner and then, on the balance of some of the consultation information that comes in, making a decision. That will not enable the sector, the market and those providers to be engaged.
I thank noble Lords for those further points requesting clarification. On how we have come to this decision, I am happy to write to all noble Lords in the Committee. The noble Lord went in an interesting direction because, in the context of the rest of the Bill, so many of the amendments have been about protecting private users, but the noble Lord seems to be leaning more in favour of the advertisers here.
My Lords, it is all about the relative importance and the weighting. Maybe that is a good illustration of where the Government are not getting their weighting correct, both at the beginning of the Bill and in this part of it.
I take the noble Lord’s point. We are working with industry and will continue to do so. For the benefit of the Committee, we are, as I said, happy to write and explain the points of view, including those from Data: A New Direction. In response to the noble Lord, Lord Bassam, power ultimately lies with Parliament via the affirmative resolution procedure for the Secretary of State power.
I will go back to the amendments we were discussing. This regulation applies to complex and technical markets. The very reason we have taken a delegated power is so that the new exemptions can be carefully created in consultation with all affected stakeholders. As I explained, the Bill includes a requirement to consult the Information Commissioner, the Competition and Markets Authority and any other relevant stakeholders, which would include trade associations and consumers or web users.
Amendment 201 would widen the application of the “strictly necessary” exemption. Currently, it applies only to those purposes essential to provide the service requested by the user. Amendment 201 would extend this exemption so that it applies to the purposes considered essential to the website owner. We do not think this would be desirable, as it would reduce a user’s control over their privacy in a way that they might not expect.
For the reasons I have set out—and once again reaffirming the commitment to write to noble Lords on how the weighting was worked out—I hope my noble friend and the noble Baroness will not press their amendments.
My Lords, I support Amendment 208A. I declare my interest as a solicitor, though not one who has been directly involved with personal injury claims. This is an area of particular specialism that requires real expertise and experience if it is to be carried out to the best advantage of those who seek that help.
Looking back, I am concerned that this matter has been raised, in different fora, on a number of occasions. For instance, in 2016, the Telephone Preference Service opt-out was discussed when responsibility for it was transferred from Ofcom to the ICO. At that point, there was a great opportunity for this matter to be dealt with. Indeed, a number of organisations, including personal injury lawyers, the Motor Accident Solicitors Society and others, said that it was vital to carry this out and that cold calling should be ended because of the pressures it placed on an awful lot of very vulnerable people.
Since 2016, things have got worse in one respect—although, perhaps, they are a little less bad in respect of telephone calling. It is a little while now since I was last told that I had just had a major accident in my car as I was sitting enjoying a glass of wine and not having such worries in my mind. Telephone cold calling seems to have diminished but pressures through social media contact, various scams and so on have increased dramatically. I have been told this by a number of my legal colleagues.
In 2023, the Government produced the UK’s Fraud Strategy. As I am sure noble Lords will know, when it was published, it specifically pursued the question of extending the ban on cold calling to personal injury cases; that was very important and covered all sectors. So, unless there is some relationship already in place—something that would act as a defence, as it were, here—and a voluntary willingness on the part of those who suffer from personal injuries to be contacted by an organisation with which they already have a relationship, this is something that we should pursue very strongly indeed.
Although it is correct that the legal profession, and perhaps other professions, are banned from this practice on a regulatory or disciplinary basis, some of my colleagues in the profession are, in some cases, susceptible to financial and commercial pressure from these organisations, such that they would become—sometimes almost inadvertently—part of the process. Therefore, I hope that, in passing such an amendment, we would send a clear signal to the Solicitors Regulation Authority and the Law Society, underlining yet again that these practices are not acceptable for members of the profession.
My Lords, I support Amendment 208A. I am a recovering solicitor. Many moons ago, I gave public affairs advice to the Association of Personal Injury Lawyers, which is a fine organisation. I very much support its call and this amendment on that basis. I congratulate the noble Lord, Lord Leong, on his introduction to this amendment; he and the noble Lord, Lord Kirkhope, made a terrific case.
APIL took the trouble to commission research from YouGov, which showed that 38% of UK adults had received a cold call or text, and that 86% of those had a strong emotional response, being left feeling annoyed, angry, anxious, disgusted or upset. The research also revealed that almost all those who received a call supported a total ban on personal injury cold calls and text messages.
There is little for me to add but I am sorry that the noble Baroness, Lady Buscombe, is not with us—she has just exited the Room, which is unhappy timing because, in looking back at some of the discussions we have had in the House, I was about to quote her. During Report stage in the Lords on the Financial Guidance and Claims Bill, when she was a Minister, she told us:
“We know that cold calls continue and understand that more needs to be done truly to eradicate this problem. We have already committed to ban cold calls relating to pensions, and are minded to bring forward similar action in relation to the claims management industry. I have asked officials to consider the evidence for implementing a cold-calling ban in relation to claims management activities, and I am pleased to say that the Government are working through the detail of a ban on cold calling by claims management companies. There are complex issues to work through, including those relating, for example, to EU directives”;
of course, we do not have those any more. She went on to say:
“We would therefore like time to consider this important issue properly, and propose bringing forward a government amendment in the other place to meet the concerns of this House”.—[Official Report, 24/10/17; col. 861.]
How much time do the Government need? Talk about unfinished business. I know it is slightly unfair as you can unearth almost anything in Hansard but the fact is that this is bull’s eye. It is absolutely spot on on the part of APIL to have found this. I thought for one delirious minute that the noble Baroness, Lady Buscombe, was going to stand up and say, “Yes, I plead guilty. We never pursued this”.
I have texted the noble Baroness asking her to return as soon as possible so that she can listen to the noble Lord’s wise words.
I am not going to carry on much longer. I know that that will be a grave disappointment but it makes the case, I think, that it is high time that the Government did something in this area. It is clearly hugely unpopular. We need to make sure that Amendment 208A is passed. If not now, when?
My Lords, I thank the noble Baroness, Lady Jones of Whitchurch, for tabling Amendment 208A and the noble Lord, Lord Leong, for moving it. This amendment would insert new Regulation 22A into the Privacy and Electronic Communications Regulations and would prohibit unsolicited approaches via email or text, sent by or on behalf of claims management companies, encouraging people to commence personal injury claims.
The Government agree that people should not receive unsolicited emails and texts from claims management companies encouraging them to make personal injury claims. I assure noble Lords that this is already unlawful under the existing regulations. Regulation 22(2) prohibits the sending of all unsolicited electronic direct marketing communications—including, but not limited to, texts and emails—unless the recipient has previously consented to receiving the communication. Regulation 21A already bans live calling by claims management companies.
In the past year, the Information Commissioner has issued fines of more than £1.1 million to companies that have not adhered to the direct marketing rules. Clause 117 considerably increases the financial penalties that can be imposed for breaches of the rules, providing a further deterrent to rogue claims management and direct marketing organisations.
Amendments 211 and 215 relate to Clause 116 so I will address them together. Amendment 211 seeks to confirm that a provider of a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with the new duty introduced by Clause 116. I assure the noble Baroness and the noble Lord that this is a duty to share information only. It merely requires providers to share any information that they already hold or gather through routine business activities and which may indicate suspected unlawful direct marketing on their networks; it does not empower, authorise or compel a communications provider to intercept messages or listen to phone calls.
Should a communications provider become aware of information through its routine business activities that indicates that unlawful direct marketing activity may be taking place on its service or network, this duty simply requires it to share that information with the Information Commissioner. For example, a communications provider may receive complaints from its subscribers who have received numerous unsolicited direct marketing communications from a specific organisation. We know from the public consultation that people want action taken against nuisance calls and spam, and this duty will support that.
My Lords, in moving Amendment 209, I will also speak to Amendment 210, and I thank the noble Lord, Lord Clement-Jones, for adding his support.
These amendments return to the major debate that we had on day 2 in Committee regarding direct marketing for the use of democratic engagement. It is fair to say that no-one was convinced by the Minister’s arguments about why that relaxation of the rules for political parties was necessary. We will no doubt return to that issue on Report, so I shall not repeat the arguments here. Meanwhile, Clause 113 leads into the democratic engagement provisions in the Bill and provides a soft opt-in for the use of electronic mail for direct marketing for charitable, political or other non-commercial activities when the data has been collected for other purposes.
As we made clear in the previous debate, we have not asked for these more relaxed rules about political electronic marketing. We believe that these provisions take us fundamentally in the wrong direction, acting against the interests of the electorate and risking damaging the already fragile level of trust between politicians and voters. However, we support extending the soft opt-in for charities and other non-commercial organisations. This is a measure that many charities have supported.
Of course, we want to encourage campaigning by charitable organisations to raise awareness of the critical issues of the day and encourage healthy debate, so extending their opportunities to use electronic marketing for this purpose could produce a healthy boost for civic engagement. This is what our amendments are hoping to achieve.
Therefore, our Amendments 209 and 210 would amend the wording of Clause 113 to remove the relaxation of the rules specifically for political parties and close the loophole by which some political parties may try to negate the provisions by describing themselves as non-commercial entities. We believe that this is the right way forward. Ideally, these amendments would be combined with the removal of the democratic engagement provisions in Clause 114 that we have already debated.
I hope noble Lords will see the sense of these proposals and that the Minister will agree to take these amendments away and rethink the whole proposition of Clauses 113 and 114. I beg to move.
My Lords, tracking the provenance of Clause 113 has been a very interesting exercise. If we think that Clause 114 is pretty politically motivated, Clause 113 is likewise. The rules preventing political parties from availing themselves of the soft opt-in provision have been there since 2005. The Information Commissioner issued guidance on political campaigning, and it was brought within the rules. Subsequently, a tribunal ruling confirmed that position: the SNP was issued with an enforcement notice and the information tribunal dismissed the appeal.
The Conservative Party was fined in 2021 for sending emails to people who did not ask for them. Then, lo and behold, there was a Conservative Party submission to the House of Lords Democracy and Digital Technologies Committee in 2020, and that submission has been repeated on a number of occasions. I have been trying to track how many times the submission has been made by the Conservative Party. The submission makes it quite clear that there is frustration in the Conservative Party. I have the written evidence here. It says:
“We have a number of concerns about the Information Commissioner’s draft code”—
as it then was: it is now a full code—
“on the use of data for political campaigning. In the interests of transparency, I enclose a copy of the response that the Conservative Party sent to the consultation. I … particularly flag the potential chilling effect on long-standing practices of MPs and councillors from engaging with their local constituents”.
Now, exactly as the noble Baroness has said, I do not think there is any call from other political parties to change the rules. I have not seen any submissions from any other political party, so I would very much like to know why the Government have decided to favour the Conservative Party in these circumstances by changing the rules. It seems rather peculiar.
The guidance for personal data in political campaigning, which I read while preparing for this debate, seems to be admirably clear. It is quite long, but it is admirably clear, and I congratulate the ICO on tiptoeing through the tulips rather successfully. However, the fact is that we have very clear guidance and a very clear situation, and I entirely agree with the noble Baroness that we are wholly in favour of charities being able to avail themselves of the new provisions, but allowing political parties to do so is a bridge too far and, on that basis, I very much support the amendment.
My Lords, I thank the noble Baroness, Lady Jones, for Amendments 209 and 210, which would amend Clause 113 by removing electronic communications sent by political parties from the scope of the soft opt-in direct marketing rule. A similar rule to this already exists for commercial organisations so that they can message customers who have previously purchased goods or services about similar products without their express consent. However, the rule does not apply if a customer has opted out of receiving direct marketing material.
The Government consider that similar rules should apply to non-commercial organisations. Clause 113 therefore allows political parties, charities and other non-commercial organisations that have collected contact details from people who have expressed an interest in their objectives to send them direct marketing material without their express consent. For people who do not want to receive political messaging, we have included several privacy safeguards around the soft opt-in measure that allow them to opt out easily of receiving further communications.
Support for a political party’s objectives could be demonstrated, for example, through a person’s attendance at a party conference or other event, or via a donation made to the party. In these circumstances, it seems perfectly reasonable for the party to reach out to that person again with direct marketing material, provided that the individual has not objected to receiving it. I reassure the Committee that no partisan advantage is intended via these measures.
My Lords, perhaps the Minister could elucidate exactly what is meant by “supporting the party’s objectives”. For instance, if we had a high street petition, would that be sufficient to grab their email address and start communicating with them?
I suppose it would depend on the petition and who was raising it. If it were a petition raised or an activity supported by a particular party, that would indicate grounds for a soft opt-in, but of course anyone choosing not to receive these things could opt out either at the time or later, on receipt of the first item of material.
So what the Minister is saying is that the solicitor, if you like, who is asking you to sign this petition does not have to say, “Do you mind if I use your email address or if we communicate with you in future?” The person who is signing has to say, “By the way, I may support this local campaign or petition, but you’re not going to send me any emails”. People need to beware, do they not?
Indeed. Many such petitions are of course initiated by charitable organisations or other not-for-profits and they would equally benefit from the soft opt-in rule, but anyone under any of those circumstances who wished not to receive those communications could opt out either at the time or on receipt of the first communication on becoming aware that they were due to receive these. For those reasons, I hope that the noble Baroness will not press her amendments in relation to these provisions.
It is important that my noble friend answers that question. The point is that if we find—I am sorry, I still speak as if I am involved with it, which I am not, but I promise noble Lords that I have spent so much time in this area. If the DWP finds that there is a link that needs pursuing then that obviously has to be opened up to some degree to find what is going on. Remember, the most important thing about this is that the right people get the right benefits. That is what the Government are trying to achieve.
My Lords, I note that the DWP has been passed a parcel by the Department for Science, Innovation and Technology—and I am not at all surprised. I am sure it will be extremely grateful to have the noble Baroness, Lady Buscombe, riding to its defence today as well. Also, attendance at this debate demonstrates the sheer importance of this clause.
We on these Benches have made no secret that this is a bad Bill—but this is the worst clause in it, and that is saying something. It has caused civil society organisations and disability and welfare charities to rise as one against it, and it has drawn criticism from organisations as disparate as UK Finance, mentioned by the noble Lord, Lord Davies, and the ICO itself. The ICO has gone into print to say that, for this measure to be deemed a necessary and proportionate interference in people’s private lives, to be in accordance with the law and to satisfy relevant data protection requirements, legislative measures must be drafted sufficiently tightly—et cetera. It has issued a number of warnings about this. For a regulator to go into print is extremely unusual.
Of course, we also have Big Brother Watch and the Child Poverty Action Group—I pay tribute to the noble Baroness, Lady Lister—the National Survivor User Network, Disability Rights UK, the Greater Manchester Coalition of Disabled People and the Equality and Human Rights Commission. We have all received a huge number of briefings on this. This demonstrates the strong feelings, and the speeches today have demonstrated the strong feelings on this subject as well.
There have been a number of memorable phrases that noble Lords have used during their speeches. The noble Baroness, Lady Kidron, referred to a “government fishing expedition”. The noble Baroness, Lady Chakrabarti, called it “breathtaking in its scope”. I particularly appreciated the speech of the noble Lord, Lord Kamall, who said, “What happened to innocence?” In answer to the noble Baroness, Lady Buscombe, this is not “nuanced”: this is “Do you require suspicion or do you not?” That seems to me to be the essence of this.
I was in two minds about what the noble Lord, Lord Sikka, said. I absolutely agree with him that we need to attack the fat cats as much as we attack those who are much less advantaged. He said, more or less, “What is sauce for the goose is sauce for the gander”. The trouble is that I do not like the sauce. That was the problem with that particular argument. The noble Baroness, Lady Lister, talked about stigma. I absolutely agree. The noble Lord, Lord Vaux, more or less apologised for using the word “draconian” at Second Reading, but I thought the word “overreach” was extremely appropriate.
We have heard some powerful speeches against Clause 128. It is absolutely clear that it was slipped into the Bill alongside 239 other amendments on Report in the Commons. I apologise to the Committee, but clearly I need to add a number of points as well, simply to put on record what these Benches feel about this particular clause. It would introduce new powers, as we have heard, to force banks to monitor all bank accounts to find welfare recipients and people linked to those payments. We have heard that that potentially includes landlords and anyone who triggers potential fraud indicators, such as frequent travel or savings over a certain amount. We have seen that the impact assessment indicates that the Government’s intention is to “initially”—that is a weasel word—use the power in relation to universal credit, pension credit and employment and support allowance. We have also heard that it could be applied to a much wider range of benefits, including pensions. The Government’s stated intent is to use the power in relation to bank accounts in the first instance, but the drafting is not limited to those organisations.
Of course, everyone shares the intent to make sure that fraudulent uses of public money are dealt with, but the point made throughout this debate is that the Government already have power to review the bank statements of welfare fraud suspects. Under current rules, the DWP is able to request bank account holders’ bank transaction details on a case-by-case basis if there are reasonable grounds to suspect fraud. That is the whole point. There are already multiple powers for this purpose, but I will not go through them because they were mentioned by other noble Lords.
This power would obviously amend the Social Security Administration Act to allow the DWP to access the personal data of welfare recipients by requiring the third party served with a notice, such as a bank or building society, to conduct mass monitoring without suspicion of fraudulent activity, as noble Lords have pointed out. Once issued, an account information notice requires the receiver to give the Secretary of State the names of the holders of the accounts. In order to do this, the bank would have to process the data of all bank account holders and run automated surveillance scanning for benefit recipients, as we have heard.
New paragraph 2(1)(b) states that an account information notice requires,
“other specified information relating to the holders of those accounts”,
and new paragraph 2(1)(c) refers to other connected information, “as may be specified”. This vague definition would allow an incredibly broad scope of information to be requested. The point is that the Government already have the power to investigate where there is suspicion of fraud. Indeed, the recently trumpeted prosecution of a number of individuals in respect of fraud amounting to £53.9 million demonstrates that. The headlines are in the Government’s own press release:
“Fraudsters behind £53.9 million benefits scam brought to justice in country’s largest benefit fraud case”.
So what is the DWP doing? It is not saying, “We’ve got the powers. We’ve found this amount of fraud”. No, it is saying, “We need far more power”. Why? There is absolutely no justification for that. No explanation is provided for how these new surveillance powers will be able to differentiate between different kinds of intentional fraud and accidental error.
We have heard about the possibility and probability of automated decision-making being needed here. I do not know what the Minister will say about that, but automated decision-making would be concerning enough in itself; and if, instead, the DWP chooses to make these decisions through human intervention, the scale of the operation will require a team so large that this will be an incredibly expensive endeavour, defeating the money-saving mandate underpinning this proposed new power, although, as a number of noble Lords have pointed out, we do not know from any impact assessment what the Government expect to gain from this power.
It is wholly inappropriate for the Government to order private banks, building societies and other financial services providers to conduct mass algorithmic suspicionless surveillance and reporting of their account holders on behalf of the state in pursuit of these policy aims. It would be dangerous for everyone if the Government reversed the presumption of innocence. This level of financial intrusion and monitoring, affecting millions of people, is highly likely to result in serious mistakes and sets an incredibly dangerous precedent.
This level of auditing and insight into people’s private lives is a frightening level of government overreach, in the words of the noble Lord, Lord Vaux, and all the more so for some of the most marginalised in society. This will allow disproportionate and intrusive surveillance of people in the welfare system. In its impact statement, the DWP says it will ensure that data will be
“transferred, received and stored safely”.
That is in contrast to the department’s track record of data security, particularly considering that it was recently reprimanded by the ICO for data leaks so serious that they were reported to risk the lives of survivors of domestic abuse. With no limitations set around the type of data the DWP can access, the impact could be even more obscure.
We have heard about the legal advice obtained by Big Brother Watch. It is clear that, on the basis that,
“the purpose of the new proposed powers is to carry out monitoring of bank accounts”
and that an account information notice can be issued
“where there are no ‘reasonable grounds’ for believing a particular individual has engaged in benefit fraud or has made any mistake in claiming benefits”,
this clause is defective. It also says that
“financial institutions would need to subject most if not all of their accountholders to algorithmic surveillance” in order to comply;
that this measure
“will be used not just in relation to detection of fraud but also error”;
and that this measure
“would not be anchored in or constrained by anything like the same legal and regulatory framework”
as the Investigatory Powers Act. It concludes:
“The exercise of the financial surveillance/monitoring powers contained in the DPDIB, as currently envisaged, is likely to breach the Article 8 rights of the holders of bank accounts subject to such monitoring”
It is clear that we should scrap this clause in its entirety.
That is a fair challenge and I will certainly be coming on to that. I have in my speech some remarks and a much more limited reassurance for the noble Lord.
It is only when there is a signal of potential fraud or error that the DWP may undertake a further review, using our business-as-usual processes and existing powers—an important point. DWP will not share any personal information with third parties under this power, and only very limited data on accounts that indicate a potential risk of fraud or error will be shared with DWP in order to identify a claimant on our system. As I said earlier, I will say more about the limited aspects of this later in my remarks.
I am sorry to interrupt the Minister, but will he be coming on to explain what these signals are? He is almost coming to a mid-point between innocence and suspicion called “signals”—is this a new concept in law? What are we talking about and where in all of Schedule 11 is the word “signal”?
If the noble Lord will allow me, I would like to make some progress and I hope that this will come out in terms of what we may be seeking on a limited basis.
The first third parties that we will designate will be banks and other financial institutions, as the Committee is aware. We know that they hold existing data that will help to independently verify key eligibility factors for benefits.
This clause does not give DWP access to any bank accounts—a very important point—nor will it allow DWP to monitor how people spend their money or to receive sensitive information, such as medical records or data on opinions or beliefs.
As the noble Baroness, Lady Sherlock, mentioned—I want to try to answer one of her questions—this power cannot be used to suspend someone’s benefit. Cases that are flagged must be reviewed under existing processes and powers—business as usual, which I mentioned earlier—to determine whether incorrect payments are being made.
Our approach is not new. HMRC has long been using powers to request data at scale from banks on all taxpayers under Schedule 23 to the Finance Act 2011. Our approach carries similar safeguards. Tax fraud is no different from welfare fraud and should be treated similarly. This was a key point that the Prime Minister made only on Friday when he committed to bring DWP’s fraud and error powers more in line with those of HMRC. This is one clear area where we are seeking to do this.
This allows me to go on to very important points about safeguards. Not all the cases found through this power will be fraud. Some will be errors which the power will help to correct, preventing overpayment debt building up. Some cases may also have legitimate reasons for seemingly not meeting eligibility requirements, for example where claimants have certain compensation payments that are disregarded for benefit eligibility rules. In those cases, no further action will be taken. Our robust business-as-usual processes will ensure that all cases are dealt with appropriately.
Another question raised by the noble Lord, Lord Vaux, on safeguards was to do with the legislation. A key safeguard is that we cannot approach just any third party; there must be a three-way relationship between the department, the claimant and the third party. This safeguard will narrow the use of this power substantially and ensure that it is used proportionately, as these three-way relationships are limited, meaning that data cannot be gathered at scale from just any source for any purpose. Any third party we want to get data from will need to be designated in affirmative regulations that noble Lords will have an opportunity to scrutinise. These regulations will be accompanied by a code of practice. We will be bringing that forward, and we will consult on the code before presenting it to Parliament—which answers a question raised by, I think, the noble Baroness, Lady Kidron.
The power also ensures that we can request only very limited data on benefit recipients. I think this addresses a point raised by the noble Lord, Lord Vaux. We must work with key third parties to define what is shared, but our expectation is that this would be a name and date of birth or a unique payment number, along with the eligibility criteria someone has matched against: for example, a benefit claimant who has more savings than the benefit rules would normally allow.
Outside controls will apply here, too. DWP already handles vast amounts of data, including personal data, and must adhere to the UK GDPR and the Data Protection Act 2018.
On the point, raised again during this debate, about the remarks made by the Information Commissioner’s Office in its updated report on this measure, which was published as Committee started and of which the Committee may be aware, I was pleased to see that the commissioner now acknowledges that the third-party data measure is in pursuit of a legitimate aim, stating:
“I understand and recognise the scale of the problem with benefit fraud and error that government is seeking to address and accept that the measure is in pursuit of a legitimate aim. I am not aware of any alternative, less intrusive, means of achieving the government’s stated policy intent based on their analysis”.
I think that is a significant point to make, and it is a point with which I very strongly agree.
It is also worth pointing out that the paragraph I quoted follows immediately on from that one. That is the qualification that I quoted.
Yes, I am aware of that. I think the noble Lord was alluding to the point about proportionality. I listened carefully and took note of that, but do not entirely agree with it. I hope that I can provide further reassurances, if not now then in the coming days and weeks. The point is that there is no other reasonable way to independently verify claimants’ eligibility for the payment that they are receiving.
I turn to the amendments raised, starting with the stand part notice from the noble Baronesses, Lady Kidron and Lady Chakrabarti, the noble Lord, Lord Anderson of Ipswich, who is not in his place, and the noble Lord, Lord Clement-Jones. They and my noble friend Lord Kamall, who is not in his place, interestingly, all made their case for removing the clause, of which I am well aware. However, for the reasons that I just set out, this clause should stand part of the Bill.
In raising her questions, the noble Baroness, Lady Kidron, made some comparisons with HMRC. There are appropriate safeguards in place for this data-gathering power, which will be included in the code of practice. The safeguards for this measure will be equivalent to those in place for the similar HMRC power which Parliament approved in the Finance Act 2011.
The noble Lord has set me quite a challenge at the Dispatch Box. It is out of scope of today’s session but, having said that, I will reflect on his question afterwards.
I am aware that time is marching on. My noble friend Lord Kamall asked about burdens on banks. We believe that the burdens on banks will be relatively low.
The noble Baroness, Lady Sherlock, made a number of points; I may have to write to her to expand on what I am about to say. Removing the requirement for third parties to provide legible copies of information means that DWP could receive the information but there is a risk that the information is not usable; that is my answer to her points. This could limit the data that DWP receives and prevent us utilising the power in full, which could in turn impact the savings due to be realised from this important measure.
I turn to the final amendments in this group, which were raised by the noble Baroness. They would place requirements on the Secretary of State to issue statements in the House and consult on the code of practice. We will talk more about the code of practice later on in this debate, and I have already made clear my firm opinions on it: we will take it forward and are already working on it. There will be a consultation that will, of course, allow anybody with an interest in this to give their views.
I turn to the number of statements that must be made in the House regarding the practical use of the measures before powers can commence, such as the role that artificial intelligence will play or assurances on any outsourcing of subsequent investigations. This is an important point to make and was raised by other Peers. I want to make it clear that this measure will be rolled out carefully and slowly through a “test and learn” approach from 2025, in conjunction with key third parties. To make these statements in the House would pre-empt the crucial “test and learn” period. I say again that discussions with the third parties are deep and detailed and we are already making progress; this point was made by the noble Lord, Lord Clement-Jones, on the link with banks and third parties.
Importantly, I assure the noble Baroness, Lady Sherlock, that we will not make any automated decisions off the back of this power; this was also raised by the noble Baroness, Lady Kidron. The final decision must and will always involve a human being—a human agent in these cases—and any signals of potential fraud or error will be looked at comprehensively. I am grateful for the remarks of my noble friend Lady Buscombe on this matter.
I know that I have not answered a number of questions. Perhaps I can do so in our debate on another group; otherwise, I certainly wish to answer them fully in a letter. I hope that I have explained clearly, from our perspective, why this power is so important; why it is the right power to take; and how we have carefully designed it, and continue to design it, with the key safeguards in mind. I strongly value the input from all those who have contributed today but I remain unconvinced that the proposed amendments are necessary or that they would strengthen the power beyond the clear safeguards I have set out. With that, I hope that the noble Baroness will not press her opposition to Clause 128.
I may have missed something, but can I just check that the Minister will deal with the matter of signals, which he mentioned at the beginning of his response? Will he deal with where that phrase comes from, what they are, whether they will be in the code, et cetera? There are a lot of questions around that. Does it amount to actual suspicion?
Absolutely; I am keen to make sure that I answer on that. It may be possible to do so in the next group but, if not, I will certainly do so in the form of a precise letter—added to the larger letter that I suspect is coming the noble Lord’s way.
My Lords, I was not intending to speak on this group, but another question occurs to me. We have been assuming throughout this that we are talking about requests for information to banks, but the Bill actually says:
“The Secretary of State may give an account information notice to a person of a prescribed description”.
Could the Minister explain what that is?
My Lords, I would of course much prefer Clause 128 not to stand part, but we have just been privileged to hear a master class from the noble Baroness, Lady Sherlock. She talked about these being probing amendments, but I do not think that I have seen a schedule so expertly sliced and diced before. If those are probing amendments, they are pretty lethal. I agree with so many of those elements. If we are to have these provisions, those are the kinds of additions that we would want and the questions that we would want to ask about them. I very much hope that the Minister has lots of answers, especially for the noble Baroness, Lady Sherlock, but also for the other noble Lords who have spoken.
My Lords, the debate on this group has focused largely on the amendments from the noble Baroness, Lady Sherlock, regarding using powers only where there is a suspicion of fraud, making provisions so that information collected can be used only for the narrow purpose of determining overpayment, removing pension-age benefits from the scope of the powers and requiring approval from Parliament before the power can be used on specific working-age benefits.
I was going to go over the reason behind these measures once again, but I will not delay the Committee on why we are bringing them forward. I believe I did that at some length in the previous group, so I am going to turn to the amendments raised.
Narrowing these powers as suggested by the noble Baroness, with Amendments 220, 221, 222 and 222A, would leave us exposed to those who deliberately aim to defraud the welfare system and would undermine the policy intent of this measure. In fact, taken together, these amendments would render the power unworkable and ineffective.
To restrict the power to cases where DWP already has a suspicion of fraud, as suggested by the noble Baroness, would defeat the purpose of this measure. The intent is to enable us to use data from third parties to independently check that benefit eligibility rules are being complied with. We use data from other sources to do this already. For example, we use data from HMRC to verify earnings in UC and check that the benefit eligibility rules are being complied with. Parliament has determined that, to be eligible for a benefit, certain rules and requirements must be met, and the Government have a responsibility to ensure that taxpayers’ money is spent responsibly. Therefore, the DWP should be able to utilise information from third parties to discharge that duty. This is an appropriate and proportionate response to a significant fraud and error challenge.
The noble Baroness, Lady Sherlock, also proposed that the power should be restricted such that it would not apply to persons who hold an account into which a benefit is paid on behalf of someone who cannot manage their own financial affairs—such persons are referred to as “appointees”. An appointee is a person who may be appointed by the Secretary of State to act on behalf of the benefit customer. Usually, the appointee becomes legally responsible for acting on the customer’s behalf in all matters related to the claim. It is also made clear to the appointee, in the documents that they sign, that we may get information about them or the person they are acting for from other parties, or for any other purposes that the law allows, to check the information they provide.
Under our proposed legislation, it is right to say that there may be some people who are not themselves benefit claimants but who have given a person permission to pay benefits into their bank account, who may be picked up in the data returned by third parties. Under the noble Baroness’s amendment, we would not be able to gather data on appointees, which would make the power unworkable, because third parties would not be able to distinguish between an individual managing their own benefit and an appointee. It also assumes that no fraud or error can occur in these cases, which is definitely wrong. I assure the noble Baroness that we handle such cases regularly and have robust existing processes for identifying appointees on our own database and for carefully handling cases of this nature.
The noble Baroness would also like to see the power—
I clearly cannot go far enough today, but, because this is important and we are in Committee, I need to give some further reassurances on where we are in the process in terms of filtering. If I may conclude my remarks, I will finish this particular point. This is all part of the test and learn, and I give some reassurance that we are working through these important issues in relation to appointees and landlords.
It is precisely as the noble Baroness, Lady Kidron, said on the last group—this is a massive net. It feels as though this is so experimental that there is no certainty about how it will operate, and the powers are so broad that anything could be subject to them. It sounds extremely dangerous, and it is no wonder that everybody is so concerned.
I do not agree with that. We have done quite a lot of business together across the Chamber. That is a slightly sweeping statement, because I have given some reassurance that we are already working with the third parties to make sure that we have robust processes in place. For instance, when we are talking about landlords, while it is possible that a landlord’s account may be matched under the measure, only minimum information will be provided by the third parties to enable my department to identify an individual within our own database. With all the data received, we will make further inquiries only where appropriate and where the information is relevant to the benefit claim. This is already part of our business-as-usual processes.
My Lords, I am sorry to interrupt the Minister but, throughout these two groups, he has, in a sense, introduced wholly new concepts. We have “test and learn” and “filtering”—which sounds extraordinary—and “signals”, but none seems to be in the black letter of the schedule, nor in the rest of the Bill. We have a set of intentions, and we are meant to trust what the DWP is doing with these powers. Does the Minister not recognise that the Committee is clearly concerned about this? It needs tying down, whether by starting from scratch and getting rid of the clause or by taking on board the amendments put forward by the noble Baroness, Lady Sherlock. The uncertainty around this is massive.
(8 months ago)
Grand Committee

My Lords, it has been a privilege to be at the ringside during these three groups. I think the noble Baroness, Lady Sherlock, is well ahead on points and that, when we last left the Minister, he was on the ropes, so I hope that, to avoid the knockout, he comes up with some pretty good responses today, especially as we have been lucky enough to have the pleasure of reading Hansard between the second and third groups. I think the best phrase the noble Baroness had was the “astonishing breadth” of Clause 128 and Schedule 11, which we explored with horror last time. I very much support what she says.
The current provisions seem to make the code non-mandatory, yet we discovered that they lack the words “reasonable suspicion”, which are in the national security legislation—fancy having the Home Office as our model in these circumstances. Does that not put the DWP to shame? If we have to base best practice on the Home Office, we are in deep trouble.
That aside, we talked about “filtering” and “signals” last time. The Minister used that phrase twice, I think, and we learned about “test and learn”. Will all that be included in the code?
All this points to the fragility and breadth of this schedule. It has been dreamt up in an extraordinarily expansive way without considering all the points that the noble Lord, Lord Anderson, has mentioned, including the KC’s opinion, all of which point to the fact that this schedule is going to infringe Article 8 of the European Convention on Human Rights. I hope the Minister comes up with some pretty good arguments.
My final question relates to the impact assessment–or non-impact assessment. The Minister talked about the estimate of DWP fraud, which is £6.4 billion. What does the DWP estimate it will be after these powers are implemented, if they are ever implemented? Should we not have an idea of the DWP’s ambitions in this respect?
My Lords, this has been a somewhat shorter debate than we have been used to, bearing in mind Monday’s experience. As with the first two groups debated then, many contributions have been made today and I will of course aim to answer as many questions as I can. I should say that, on this group, the Committee is primarily focusing on the amendments brought forward by the noble Baroness, Lady Sherlock, and I will certainly do my very best to answer her questions.
From the debate that we have had on this measure, I believe that there is agreement in the Committee that we must do more to clamp down on benefit fraud. That is surely something on which we can agree. In 2022-23, £8.3 billion was overpaid due to fraud and error in the benefit system. We must tackle fraud and error and ensure that benefits are paid to those genuinely entitled to the help. These powers are key to ensuring that we can do this.
I will start by answering a question raised by the noble Lord, Lord Anderson—I welcome him to the Committee for the first time today. He described himself as a “surveillance nerd”, but perhaps I can entreat him to rename himself a “data-gathering nerd”. As I said on Monday, this is not a surveillance power and suggesting that it is simply causes unnecessary worry. This is a power that enables better data gathering; it is not a surveillance or investigation power.
The third-party data measure does not allow the DWP to see how claimants spend their money, nor does it give the DWP access to millions of people’s bank accounts, as has been inaccurately presented. When the DWP examines the data that it receives from third parties, this data may suggest that there is fraud or error and require a further review. This will be done through our normal, regular, business-as-usual processes to determine whether incorrect payments are indeed being made. This approach is not new. As alluded to in this debate, through the Finance Act 2011, Parliament has already determined that this type of power is proportionate and appropriate, as HMRC already holds similar powers regarding banking institutions and third parties in relation to all taxpayers.
I listened very carefully to the noble Lord and will, however, take back his points and refer again to our own legal team. I think the point was made about the legality of all this. It is a very important point that he has made with all his experience, and I will take it back and reflect on it.
That is a very fair question, and I hope that I understand it correctly. I can say that the limit for the DWP is that it can gain only what the third party produces. Whatever goes on behind the doors of the third party is for them and not us. Whether there is a related account and how best to operate is a matter for the bank to decide. We may therefore end up getting very limited information, in terms of the limits of our powers. I hope that helps, but I will add some more detail in the letter.
My Lords, the Minister extolled the green-rated nature of this impact assessment. In the midst of all that, did he answer my question?
I asked about the amount of fraud that the Government plan to detect, on top of the £6.4 billion in welfare overpayments that was detected last year.
The figure that we have is £600 million but, again, I will reflect on the actual question that we are looking to address—the actual amount of fraud in the system.
The Minister is saying that that figure is not to be found in this green-rated impact assessment, which most of us find to be completely opaque.
I will certainly take that back, but it is green rated.
My Lords, we have talked about proportionality and disproportionality throughout the debate on this Bill. Is it not extraordinary that that figure is not on the table, given the extent of these powers?
My Lords, the Minister was kind enough to mention me a little earlier. Can I just follow up on that? In the impact assessment, which I have here, nowhere can I find the £600 million figure, nor can I find anywhere the costs related to this. There will be a burden on the banks and clearly quite a burden on the DWP, actually, if it has got to trawl through this information, as the noble Viscount says, using people rather than machines. The costs are going to be enormous to save, it would appear, up to £120 million per year out of £6.4 billion per year of fraud. It does seem odd. It would be really helpful to have those cost numbers and to understand in what document they are, because I cannot find in the impact assessment where these numbers are.
I hope I can help both noble Lords. Although I must admit that I have not read every single page, I understand that the figure of £500 million is in the IA.
Yes, £500 million. I mentioned £600 million altogether; that was mentioned by the OBR, which had certified this, and by the way, that figure was in the Autumn Statement.
My Lords, has not that demonstrated the disproportionality of these measures?
The noble Viscount explained in response to the noble Lord, Lord Anderson, that at every stage where the powers are going to be expanded, it would come back as an affirmative regulation. I might have been a bit slow about this, but I have been having a look and I cannot see where it says that. Perhaps he could point that out to me, because that would provide some reassurance that each stage of this is coming back to us.
I understand, very quickly, that it is in paragraph 1(1), but again, in the interests of time, maybe we could talk about that outside the Room.
I can reassure the noble Lord that that is the case, yes.
I reassure noble Lords that that is correct—it is paragraph 1(1). It may be rather complex, but it is in there, just to reassure all noble Lords.
I am sorry to keep coming back, but did the Minister give us the paragraph in the impact assessment that referred to £500 million?
No, I did not, but that is something which surely we can deal with outside the Room. However, I can assure noble Lords that it is in there.
My Lords, I want briefly to contribute to this debate, which I think is somewhat less contentious than the previous group of amendments. As somebody, again, who was working on the Online Safety Act all the way through, I really just pay tribute to the tenacity of the noble Baroness, Lady Kidron, for pursuing this detail—it is a really important detail. We otherwise risk, having passed the legislation, ending up in scenarios where everyone would know that it was correct for the data-gathering powers to be implemented but, just because of the wording of the law, they would not kick in when it was necessary. I therefore really want to thank the noble Baroness, Lady Kidron, for being persistent with it, and I congratulate the Government on recognising that, when there is an irresistible force, it is better to be a movable object than an immovable one.
I credit the noble Viscount the Minister for tabling these amendments today. As I say, I think that this is something that can pass more quickly because there is broad agreement around the Committee that this is necessary. It will not take away the pain of families who are in those circumstances, but it will certainly help coroners get to the truth when a tragic incident has occurred, whatever the nature of that tragic incident.
My Lords, having been involved in and seen the campaigning of the bereaved families and the noble Baroness, Lady Kidron, in particular in the Joint Committee on the Draft Online Safety Bill onwards, I associate myself entirely with the noble Baroness’s statement and with my noble friend Lord Allan’s remarks.
My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.
Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words
“or are due to conduct an investigation”
are indeed superfluous.
We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.
There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.
Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.
My Lords, I will be brief because we very much support these amendments. Interestingly, Amendment 239 from the noble Baroness, Lady Jones, follows closely on from a Private Member’s Bill presented in November 2021 by the Minister’s colleague, Minister Saqib Bhatti, and before that by the right honourable Andrew Mitchell, who is also currently a Minister. The provenance of this is impeccable, so I hope that the Minister will accept Amendment 239 with alacrity.
We very much support Amendment 250. The UK Commission on Bereavement’s “Bereavement is Everyone’s Business” is a terrific report. We welcome Clause 133, but we think that improvements can be made. The amendment from the noble Baroness, which I have signed, will address two of the three recommendations that the report made on the Tell Us Once service. It said that there should be a review, which this amendment reflects. It also said that
“regulators must make sure bereaved customers are treated fairly and sensitively”
by developing minimum standards. We very much support that. It is fundamentally a useful service but, as the report shows, it can clearly be improved. I congratulate the noble Baroness, Lady Jones, on picking up the recommendations of the commission and putting them forward as amendments to this Bill.
My Lords, I declare an interest as someone who has been through the paper death registration process and grant of probate, which has something to do with why I am in your Lordships’ House, so I absolutely understand where the noble Baroness, Lady Jones of Whitchurch, is coming from. I thank her for tabling these amendments to Clauses 133 and 142. They would require the Secretary of State to commission a review with a view to creating a single digital register for the registration of births and deaths and to conduct a review of the Government’s Tell Us Once scheme.
Clause 133 reforms how births and deaths are registered in England and Wales by enabling a move from a paper-based system of birth and death registration to registration in a single electronic register. An electronic register is already in use alongside the paper registers and has been since 2009. Well-established safety and security measures and processes are already in place with regard to the electronic infrastructure, which have proven extremely secure in practice. I assure noble Lords that an impact assessment has been completed to consider all the impacts relating to the move to an electronic register, although it should be noted that marriages and civil partnerships are already registered electronically.
The strategic direction is to progressively reduce the reliance on paper and the amount of paper in use, as it is insecure and capable of being tampered with or forged. The creation of a single electronic register will remove the risk of registrars having to transmit loose-leaf register pages back to the register office when they are registering births and deaths at service points across the district. It will also minimise the risk of open paper registers being stolen from register offices.
The Covid-19 pandemic had unprecedented impacts on the delivery of registration services across England and Wales, and it highlighted the need to offer more choice in how births and deaths are registered in the future. The provisions in the Bill will allow for more flexibility in how births and deaths are registered—for example, registering deaths by telephone, as was the case during the pandemic. Over 1 million deaths were successfully registered under provisions in the Coronavirus Act 2020. This service was well received by the public, registrars and funeral services.
Measures will be put in place to ensure that the identity of an informant is established in line with Cabinet Office good practice guidance. This will ensure that information provided by informants can be verified or validated for the purposes of registering by telephone. For example, a medical certificate of cause of death issued by a registered medical practitioner would need to have been received by the registrar before an informant could register a death by telephone. Having to conduct a review, as was proposed by the noble Baroness, Lady Jones, would delay moving to digital ways of working and the benefits this would introduce.
My Lords, I thank the Minister for his exposition. He explained the purposes of Clauses 138 to 141 and extolled their virtues, and helpfully explained what my amendments are trying to do—not that he has shot any foxes in the process.
The purpose of my amendments is much more fundamental, and that is to question the methodology of the Government in all of this. The purpose of NUAR is to prevent accidental strikes where building works damage underground infrastructure. However, the Government seem to have ignored the fact that an equivalent service—LinesearchbeforeUdig, or LSBUD—already achieves these aims, is much more widely used than NUAR and is much more cost effective. The existing system has been in place for more than 20 years and now includes data from more than 150 asset owners. It is used by 270,000 UK digging contractors and individuals—and more every day. The fact is that, without further consultation and greater alignment with current industry best practice, NUAR risks becoming a white elephant, undermining the safe working practices that have kept critical national infrastructure in the UK safe for more than two decades.
However, the essence of these amendments is not to cancel NUAR but to get NUAR and the Government to work much more closely with the services that already exist and those who wish to help. They are designed to ensure that proper consultation and democratic scrutiny is conducted before NUAR is implemented in statutory form. Essentially, the industry says that NUAR could be made much better and much quicker if it worked more closely with the private sector services that already exist. Those who are already involved with LinesearchbeforeUdig say, first of all, that NUAR will create uncertainty and reduce safety, failing in its key aims.
The Government have been developing NUAR since 2018. Claiming that it would drive a reduction in accidental damage to underground assets during roadworks, the impact assessment incorrectly states:
“No businesses currently provide a service that is the same or similar to the service that NUAR would provide”.
In fact, as I said, LSBUD has been providing a safe digging service in the UK for 20 years and has grown significantly over that time. Without a plan to work more closely with LSBUD as the key industry representative, NUAR risks creating more accidental strikes on key network infrastructure, increasing risks to workers’ safety through electrical fires, gas leaks, pollution and so on. The public, at home or at work, would also suffer more service outages and disruption.
Secondly, NUAR will add costs and stifle competition. The Government claim that NUAR will deliver significant benefits to taxpayers, reduce disruption and prevent damage to underground assets, but the impact assessment ignores the fact that NUAR’s core functions are already provided through the current system—so its expected benefits are vastly overstated. While asset owners, many of whom have not been consulted, will face costs of more than £200 million over the first 10 years, the wholesale publication of asset owners’ entire networks creates risks around commercially sensitive information, damaging innovation and competition. Combined with the uncertainties about how quickly NUAR can gain a critical mass of users and data, this again calls into question why NUAR does not properly align with and build on the current system but instead smothers competition and harms a successful, growing UK business.
Thirdly, NUAR risks undermining control over sensitive CNI data. Underground assets are integral to critical national infrastructure; protecting them is vital to the UK’s economic and national security. LSBUD deliberately keeps data separate and ensures that data owners remain in full control over who can access their data via a secure exchange platform. NUAR, however, in aiming to provide a single view of all assets, removes providers’ control over their own data—an essential security fail-safe. It would also expand opportunities for malicious actors to target sectors in a variety of ways—for instance, the theft of copper wires from telecom networks.
NUAR shifts control over data access to a centralised government body, with no clear plan for how the data is to be protected from unauthorised access, leading to serious concerns about security and theft. Safe digging is paramount; mandating NUAR will lead to uncertainty, present more health and safety dangers to workers and the public and put critical national infrastructure at risk. These plans require further review. There needs to be, as I have said, greater alignment with industry best practice. Without further consultation, NUAR risks becoming a white elephant that undermines safe digging in the UK and increases risk to infrastructure workers and the public.
I will not go through the amendments individually, as the Minister has mentioned what their effect would be, but I will dispel a few myths. The Government have claimed that NUAR has the overwhelming support of asset owners. In the view of those who briefed me, that is not an accurate reflection of the broadband and telecoms sector in particular; a number of concerns from ISPA members have been raised with the NUAR team around cost and security that have yet to be addressed. This is borne out by the notable gaps among the major asset owners in the telecoms sector signed up to NUAR at this time.
Clearly, the noble Viscount is resisting changing the procedure by which these changes are made from negative to affirmative, but I hope I have gone some way to persuade the Committee of the importance of this change to how the NUAR system is put on a statutory footing. He talked about a “handful” of data; the comprehensive nature of the existing system is pretty impressive, and it is a free service, updated on a regular basis, which covers more than 150 asset owners and 98% of high-risk assets. NUAR currently covers only one-third of asset owners. The comparisons are already not to the advantage of NUAR.
I hope the Government will at least, even if they do not agree with these amendments, think twice before proceeding at the speed they seem to be, without taking on board the concerns of those who are already heavily engaged with LinesearchbeforeUdig and find it pretty satisfactory for their purposes.
My Lords, the Minister really did big up this section of the Bill. He said that it would revolutionise this information service, that it would bring many benefits, that it has a green rating, that it would be the Formula 1 of data transfer in mapping, and so on. We were led to expect quite a lot from this part of the legislation. It is an important part of the Bill, because it signifies some government progress towards the goal of creating a comprehensive national underground asset register, as he put it, or NUAR. We are happy to support this objective, but we have concerns about the progress being made and the time it is taking.
To digress a bit here, it took me back 50 years to when I was a labourer working by the side of a bypass. One of the guys I was working with was operating our post hole borer; it penetrated the Anglian Water system and sent a geyser some 20 metres up into the sky, completely destroying my midday retreat to the local pub between the arduous exercise of digging holes. Had he had one of the services on offer, I suspect that we would not have been so detained. It was quite an entertaining incident, but it clearly showed the dangers of not having good mapping.
As I understand it, and as was outlined by the noble Lord, Lord Clement-Jones, since 2018 the Government have been moving towards this notion of recording somewhere what lies below the surface in our communities. We have had street works legislation going back several decades, from at least 1991. In general, progress towards better co-ordination of utilities excavations has not been helped by poor and low levels of mapping and knowledge of what and which utilities are located underground. This is despite the various legislative attempts to make that happen, most of which have sought to bring better co-ordination of services.
I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.
While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? Both I and the noble Lord, Lord Bassam, mentioned that; it would be very useful to know.
It did complete a pilot phase this year. As it operationalises, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.
NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—
I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit to continuing to do so.
My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. We can go only on the briefs we get. Unlike the noble Lord, Lord Bassam, I have not been underground very recently, but we do rely on the briefings we get. LSBUD is described as a
“sustainably-funded UK success story”—
okay, give or take a bit of puff—that
“responds to most requests in 5 minutes or less”.
It has
“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.
That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.
As I understand it, LSBUD services are provided as a PDF, on request. It is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided. All of it is provided on a much more manual basis. The two services simply do not brook comparison. I would be delighted to speak to LSBUD.
My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. It is the way the service is delivered, and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?
Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format, and the other is the quantity and nature of the data that is provided, which will eventually be all relevant data in the United Kingdom under NUAR, versus those who choose to sign up on LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.
Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.
I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees that demonstration will have any cynicism any more about the quality of the service provided.
All I can say is that, in that case, the Minister has been worked on extremely well.
In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for a really horrible situation, because these things very often lead to danger and death, and there is a very serious safety argument to providing this information reliably and rapidly, as NUAR will.
Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.
As I said, LSBUD has met the Geospatial Commission many times. I would be happy to meet it in order to help it adapt its business model for the NUAR future. As I said, it has attended the last three discovery workshops, allowing access to this data.
I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.
My Lords, I congratulate the noble Baroness, Lady Kidron, on her amendment and thank her for allowing me to add my name to it. I agree with what she said. I, too, had the benefit of a meeting with the Lord Chancellor, which was most helpful. I am grateful to Mr Paul Marshall—whom the noble Baroness mentioned and who has represented several sub-postmasters in the Horizon scandal—for his help and advice in this matter.
My first short point is that evidence derived from a computer is hearsay. There is good reason for treating hearsay evidence with caution. Computer scientists know—although the general public do not—that only the smallest and least complex computer programs can be tested exhaustively. I am told that the limit for that testing is probably around 100 lines of a well-designed and carefully written program. Horizon, which Mr Justice Fraser said was not in the least robust, consisted of a suite of programs involving millions of lines of code. It will inevitably have contained thousands of errors because all computer programs do. Most computer errors do not routinely cause malfunctions. If they did, they would be spotted at an early stage and the program would be changed—but potentially with consequential changes to the program that might not be intended or spotted.
We are all aware of how frequently we are invited to accept software updates from our mobile telephone’s software manufacturers. Those updates are not limited to security chinks but are also required because bugs—or, as we learned yesterday from Paula Vennells’s husband, anomalies and exceptions—are inevitable in computer programs. That is why Fujitsu had an office dedicated not just to altering the sub-postmasters’ balances, shocking as that is, but to altering and amending a program that was never going to be perfect because no computer program is.
The only conclusion that one can draw from all this is that computer programs are, as the noble Baroness said, inherently unreliable, such that having a presumption in law that they are reliable is unsustainable. In the case of the DPP v McKeown and Jones—in 1997, I think—Lord Hoffmann said:
“It is notorious that one needs no expertise in electronics to be able to know whether a computer is working properly”.
One must always hesitate before questioning the wisdom of a man as clever as Lord Hoffmann, but he was wrong. The notoriety now attaches to his comment.
The consequence of the repeal of Section 69 of the Police and Criminal Evidence Act 1984 has been to reduce the burden of proof, such that Seema Misra was sent to prison in the circumstances set out by the noble Baroness. Further, this matter is urgent for two reasons; they slightly conflict with each other, but I will nevertheless set them out. The first is that, if the presumption remains in place for one minute longer, there is a genuine risk that miscarriages of justice will continue to occur in other, non-Post Office cases, from as early as tomorrow. The second is that any defence lawyer will, in any event, be treating the presumption as having been fatally undermined by the Horizon issues. The presumption will therefore be questioned in every court where it might otherwise apply. It needs consideration by Parliament.
My noble friend the Minister will say, and he will be right, that the Horizon case was a disgraceful failure of disclosure by the Post Office. But it was permitted by the presumption of the correctness of computer evidence, which I hope we have shown is unsustainable. Part of the solution to the problem may lie in changes to disclosure and discovery, but we cannot permit a presumption that we know to be unfounded to continue in law.
My noble friend may also go on to say that our amendment is flawed in that it will place impossible burdens on prosecutors, requiring them to get constant certificates of proper working from Microsoft, Google, WhatsApp, and whatever Twitter is called nowadays. Again, he may be right. We do not seek to bring prosecutions grinding to a halt, nor do we seek to question the underlying integrity of our email or communications systems, so we may need another way through this problem. Luckily, my noble friend is a very clever man, and I look forward to hearing what he proposes.
My Lords, we have heard two extremely powerful speeches; I will follow in their wake but be very brief. For many years now, I have campaigned on amending the Computer Misuse Act; the noble Lord, Lord Arbuthnot, has done similarly. My motivation did not start with the Horizon scandal but arose more broadly from underlying concerns about the nature of computer evidence.
I came rather late to this understanding about the presumption of the accuracy of computer evidence. It is somewhat horrifying, the more you look into the history of this, which has been so well set out by the noble Baroness, Lady Kidron. I remember advising MPs at the time about the Police and Criminal Evidence Act. I was not really aware of what the Law Commission had recommended in terms of getting rid of Section 69, or indeed what the Youth Justice and Criminal Evidence Act did in 1999, a year after I came into this House.
The noble Baroness has set out the history of it, and how badly wrong the Law Commission got this. She set out extremely well the impact and illustration of Mrs Misra’s case, the injustice that has resulted through the Horizon cases—indeed, not just through those cases, but through other areas—and the whole aspect of the reliability of computer evidence. Likewise, we must all pay tribute to the tireless campaigning of the noble Lord, Lord Arbuthnot. I thought it was really interesting how he described computer evidence as hearsay, because that essentially is what it is, and there is the whole issue of updates and bug fixing.
The one area that I am slightly uncertain about after listening to the debate and having read some of the background to this is precisely what impact Mr Justice Fraser’s judgment had. Some people seem to have taken it as simply saying that the computer evidence was unreliable, but that it was a one-off. It seems to me that it was much more sweeping than that and was really a rebuttal of the original view the Law Commission took on the reliability of computer evidence.
My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—his heroic behaviour with respect to the sub-postmasters.
There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.
On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.
This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.
We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.
This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.
Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.
Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this, meeting the Minister’s objections. But can I suggest that the MoJ has contact with him and discusses what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful, if the MoJ has not already engaged with him, to do so.
My Lords, I am afraid that I will speak to every single one of the amendments in this group but one, which is in the name of the noble Baroness, Lady Jones, and which I have signed. We have already debated the Secretary of State’s powers in relation to what will be the commission, in setting strategic priorities for the commissioner under Clause 32 and recommending the adoption of the ICO code of practice before it is submitted to Parliament for consideration under Clause 33:
“Codes of practice for processing personal data”.
We have also debated Clause 34:
“Codes of practice: panels and impact assessments”.
And we have debated Clause 35:
“Codes of practice: Secretary of State’s recommendations”.
The Secretary of State has considerable power in relation to the new commission, and then on top of that Clause 143 and Schedule 15 to the Bill provide significant other powers for the Secretary of State to interfere with the objective and impartial functioning of the information commission by the appointment of non-executive members of the newly formed commission. The guarantee of the independence of the ICO is intended to ensure the effectiveness and reliability of its regulatory function and that the monitoring and enforcement of data protection laws are carried out objectively and free from partisan or extra-legal considerations.
These amendments would limit the Secretary of State’s powers and leeway to interfere with the objective and impartial functioning of the new information commission, in particular by modifying Schedule 15 to the Bill to transfer budget responsibility and the appointment process of the non-executive members of the information commission to the relevant Select Committee. If so amended, the Bill would ensure that the new information commission has sufficient arm’s-length distance from the Government to oversee public and private bodies’ uses of personal data with impartiality and objectivity. DSIT’s delegated powers memorandum to the DPRRC barely mentions any of these powers, so I am not surprised that there was no mention of them, yet they are of considerable importance.
We have discussed data adequacy before; of course, in his letter to us, the Minister tried to rebut some of the points we made about it. In fact, he quoted somebody who has briefed me extensively on it and who has taken a very different view to the one he alleges she took, in a rather partial quotation from evidence taken by the European Affairs Committee, which is now conducting an inquiry into data adequacy and its implications for the UK-EU relationship. We were told by Open Rights Group attendees at a recent meeting with the European Commission that it expressed concern to those present about the risk that the Bill poses to the EU adequacy agreement; this was not under Chatham House rules. It expressed this concern in a meeting at which a number of UK groups were present, which is highly significant in itself.
I mentioned the European Affairs Committee’s inquiry. I understand that the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs has also given written evidence on its concerns about this Bill, its impact on adequacy and how it could affect the agreement. It put its arguments rather strongly. Has the Minister seen this? Is he aware of the written evidence that it has given to the European Affairs Select Committee? I suggest that he becomes aware of it and takes a view on whether we need to postpone Report until we have seen the European Affairs Select Committee’s report. If it comes to the conclusion that data adequacy is at risk, the Government will have to go back to the drawing board in a number of respects on this Bill. If the Select Committee’s report comes out and says that the Bill will endanger data adequacy, it would be rather foolish if we had already gone through Report by that time. Far be it from me not to want the Government to have egg on their face, but it would be peculiar if they did not carefully observe the evidence being put to the European Affairs Select Committee and the progress that it is making in its inquiry. I beg to move.
My Lords, I thank the noble Lord, Lord Clement-Jones, for introducing his amendments so ably. When I read them, I had a strong sense of déjà vu, as attempts by the Government to control the appointments and functioning of new regulators have been a common theme in other pieces of legislation that we have debated in the House and which we have always resisted. In my experience, this occurred most recently in the Government’s proposals for the Office for Environmental Protection, which was dealing with EU legislation being taken into UK law and is effectively the environment regulator. We were able to get those proposals modified to limit the Secretary of State’s involvement; we should do so again here.
I very much welcome the noble Lord’s amendments, which give us a chance to assess what level of independence would be appropriate in this case. Schedule 15 covers the transition from the Information Commissioner’s Office to the new information commission and the appointment of its chair and non-executive members. We support this development in principle, but it is crucial that the new arrangements strengthen rather than weaken the independence of the new commission.
The noble Lord’s amendments would rightly remove the rights of the Secretary of State to decide the number of non-executive members and to appoint them. Instead, his amendments propose that the chair of the relevant parliamentary committee should oversee appointments. Similarly, the amendments would remove the right of the Secretary of State to recommend the appointment and removal of the chair; again, this should be passed to the relevant parliamentary committee. We agree with these proposals, which would build in an additional tier of parliamentary oversight and help remove any suspicion that the Secretary of State is exercising unwarranted political pressure on the new commission.
The noble Lord’s amendments raise the question of what the relevant parliamentary committee might be. Although we are supportive of the wording as it stands, it is regrettable that we have not been able to make more progress on establishing a strong bicameral parliamentary committee to oversee the work of the information commission. However, in the absence of such a committee, we welcome the suggestion made in the noble Lord’s Amendment 256 that the Commons Science, Innovation and Technology Committee could fulfil that role.
Finally, we have tabled Amendment 259, which addresses what is commonly known as the “revolving door”, whereby public sector staff switch to jobs in the private sector and end up working for industries that they were supposedly investigating and regulating previously. This leads to accusations of cronyism and corruption; whether or not there is any evidence of this, it brings the whole sector into disrepute. Perhaps I should have declared an interest at the outset: I am a member of the Advisory Committee on Business Appointments and therefore have a ringside view of the scale of the revolving door taking place, particularly at the moment. We believe that it is time to put standards in public life back at the heart of public service; setting new standards on switching sides should be part of that. Our amendment would put a two-year ban on members of the information commission accepting employment from a business that was subject to enforcement action or acting for persons who are being investigated by the agency.
I hope that noble Lords will see the sense and importance of these amendments. I look forward to the Minister’s response.
My Lords, I thank the Minister for his response, dusty though it may have been. The noble Baroness, Lady Jones, is absolutely right; this Government have form in all areas of regulation. In every area where we have had legislation related to a regulator coming down the track, the Government have taken more power on and diminished parliamentary oversight rather than enhancing it.
It is therefore a little rich to say that accountability to Parliament is the essence of all this. That is not the impression one gets reading the data protection Bill; the impression you get is that the Government are tightening the screw on the regulator. That was the case with Ofcom in the Online Safety Act; it is the case with the CMA; and the noble Baroness, Lady Jones, mentioned her experience as regards the environment. Wherever you look, the Government are tightening their control over the regulators. It is something the Industry and Regulators Committee has been concerned about. We have tried to suggest various formulae. A Joint Committee of both Houses was proposed by the Communications and Digital Committee; it has been endorsed by a number of other committees, such as the Joint Committee on the Draft Online Safety Bill, and I think it has even been commended by the Industry and Regulators Committee in that respect.
We need to crack this one. On the issue of parliamentary accountability and oversight of the regulator, the balance is not currently right. That applies particularly to appointments, in this case of the commissioner and the non-executives. The Minister very conveniently talked about removal, but this could be about renewal of term, and it is certainly about appointment. So maybe the Minister was a little selective in the example he chose to illustrate where the control lies.
We are concerned about the independence of the regulator. The Minister did not give an answer, so I hope that he will write about whether he knows what the European Affairs Select Committee is up to. I made a bit of a case on that. Evidence is coming in, and the relevant committee in the European Parliament is giving evidence. The Minister, the noble Viscount, Lord Camrose, was guilty of this in a way, but the way that the data adequacy aspect is seen from this side of the North Sea seems rather selective. The Government need to try to put themselves in the position of the Commission and the Parliament on the other side of the North Sea and ask, “What do we think are the factors that will endanger our data adequacy as seen from that side?” The Government are being overly complacent in regarding it as “safe” once the Bill goes through.
It was very interesting to hear what the noble Baroness had to say about the revolving door issues. The notable thing about this amendment is how limited it is; it is not blanket. It would be entirely appropriate to have this in legislation, given the sensitivity of the roles that are carried out by senior people at the ICO.
However, I think we want to make more progress tonight, so I beg leave to withdraw my amendment.
My Lords, as ever, the noble Baroness, Lady Kidron, has nailed this issue. She has campaigned tirelessly in the field of child sexual abuse and has identified a major loophole.
What has been so important is learning from experience and seeing how these new generative AI models, which we have all been having to come to terms with for the past 18 months, are so powerful in the hands of ordinary people who want to cause harm and commit sexual abuse. The important thing is that, under existing legislation, there are of course a number of provisions relating to creating deepfake child pornography, the circulation of pornographic deepfakes and so on. However, as the noble Baroness said, what the legislation does not do is go upstream to the AI system—the AI model itself—to make sure that those who develop those models are caught as well. That is what a lot of the discussion around deepfakes is about at the moment—it is, I would say, the most pressing issue—but it is also about trying to nail those AI system owners and users at the very outset, not waiting until something is circulated or, indeed, created in the first place. We need to get right up there at the outset.
I very much support what the noble Baroness said; I will reserve any other remarks for the next group of amendments.
My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.
As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.
Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from images of children, which can then be used to blackmail those children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse being found last year; I suspect that this is the tip of the iceberg, as much of this activity is occurring on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such materials will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation needed for police enforcers to close down this trade.
As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fear of the overall safety of AI, so we need to win their trust back if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.
Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.
My Lords, I will speak to all the amendments in this group, other than Amendment 295 from the noble Baroness, Lady Jones. Without stealing her thunder, I very much support it, especially in an election year and in the light of the deepfakes we have already seen in the political arena—those of Sadiq Khan, those used in the Slovakian election and the audio deepfakes of the President of the US and Sir Keir Starmer. This is a real issue and I am delighted that she has put down this amendment, which I have signed.
In another part of the forest, the recent spread of deepfake photos purporting to show Taylor Swift engaged in explicit acts has brought new attention to the use, which has been growing in recent years, of deepfake images, video and audio to harass women and commit fraud. Women constitute 99% of the victims and the most visited deepfake site had 111 million users in October 2023. More recently, children have been found using “declothing” apps, which I think the noble Baroness mentioned, to create explicit deepfakes of other children.
Deepfakes also present a growing threat to elections and democracy, as I have mentioned, and the problems are increasingly rampant. Deepfake fraud rates rose by 3,000% globally in 2023, and it is hardly surprising that, in recent polling, 86% of the UK population supported a ban on deepfakes. I believe that the public are demanding an urgent solution to this problem. The only effective way to stop deepfakes, which is analogous to what the noble Baroness, Lady Kidron, has been so passionately advocating, is for the Government to ban them at every stage, from production to distribution. Legal liability must hold to account those who produce deepfake technology, create and enable deepfake content, and facilitate its spread.
Existing legislation seeks to limit the spread of images on social media, but this is not enough. The recent images of Taylor Swift were removed from X and Telegram, but not before one picture had been viewed more than 47 million times. Digital watermarks are not a solution, as shown by a paper by world-leading AI researchers released in 2023, which concluded that
“strong and robust watermarking is impossible to achieve”.
Without measures across the supply chain to prevent the creation of deepfakes, the law will forever be playing catch-up.
The Government now intend to ban the creation of sexual imagery deepfakes; I welcome this and have their announcement in my hand:
“Government cracks down on ‘deepfakes’ creation”.
This will send a clear message that the creation of these intimate images is not acceptable. However, this appears to cover only sexual image deepfakes. These are the most prevalent form of deepfakes, but other forms are also causing noticeable and rapidly growing harms, most obviously political deepfakes—as the noble Baroness, Lady Jones, will illustrate—and deepfakes used for fraud. It also appears to cover only the endpoint of the creation of deepfakes, not the supply chain leading up to that point. There are whole apps and companies dedicated to the creation of deepfakes, and they should not exist. There are industries which provide legitimate services—generative AI and cloud computing—which fail to take adequate measures and end up enabling the creation of deepfakes. They should take measures or face legal accountability.
The Government’s new measures are intended to be introduced through an amendment to the Criminal Justice Bill, which is, I believe, currently between Committee and Report in the House of Commons. As I understand it, however, there is no date scheduled yet for Report, as the Bill seems to be caught in a battle over amendments.
The law will, however, be extremely difficult to enforce. Perpetrators are able to hide behind anonymity and are often difficult to identify, even when victims or authorities are aware that deepfakes have been created. The only reliable and effective countermeasure is to hold the whole supply chain responsible for deepfake creation and proliferation. All parties involved in the AI supply chain, from AI model developers and providers to cloud compute providers, must demonstrate that they have taken steps to preclude the creation of deepfakes. This approach is similar to the way in which society combats—or, rather, in which I hope the Minister will concede to the noble Baroness, Lady Kidron, that society will combat—child abuse material and malware.
My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.
Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, and they are not all bad. However, the intentions of those who use them can be malign or criminal, and the speed of technological developments is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.
The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films in the event of actors being unavailable or, in some cases, when actors died before filming was completed. It has also enabled filmmakers to introduce characters, or younger versions of iconic heroes, for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness, a younger version of Luke Skywalker or a de-aged Indiana Jones on our screens.
The technology that can do this is only around 15 years old and, until about five years ago, it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, usually women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful computer processors inevitably mean that what was once very expensive becomes much cheaper very quickly. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and video movement almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.
Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology, used for titillation, entertainment or comedic purposes, has developed an altogether darker presence, well beyond the reach of most legislation.
In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked, and those who are taken in by the deepfakes. Trusted public figures, celebrities and spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or to harvest data. Those who are encouraged to click through are at risk of losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information obtained under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.
In passing, it is worth remembering that deepfakes are not always images of people. Last year, crudely generated fake images of an explosion, purportedly at the Pentagon, caused the Dow Jones industrial average to drop 85 points within four minutes of the image being published, and triggered emergency response procedures from local law enforcement before being debunked 20 minutes later. The power of a single image, carefully placed and spreading virally, shows the enormous and rapid economic damage that deepfakes can create.
Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.
I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—
My Lords, I would be very happy to get a letter from the Minister.
I would be happy to write one. I will go for the abbreviated version of my speech.
I turn first to the part of the amendment that would seek to criminalise the creation, alteration or other generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, and this is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.
The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.
Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.
Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.
I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.
Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.
For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.
My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.
Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.
My Lords, what a relief—we are at the final furlong.
The UK is a world leader in genomics, which is becoming an industry of strategic importance for future healthcare and prosperity, but, frankly, it must do more to protect the genomic sector from systemic competitors that wish to dominate this industry for either economic advantage or nefarious purposes. Genomic sequencing—the process of determining the entirety of an organism’s DNA—is playing an increasing role in our NHS, which has committed to being the first national healthcare system to offer whole-genome sequencing as part of routine care. However, like other advanced technologies, our sector is exposed to data privacy and national security risks. Its dual-use potential means that it can also be used to create targeted bioweapons or a genetically enhanced military. We must ensure that a suitable data protection environment exists to maintain the UK’s world-leading status.
So, how are we currently mitigating such threats, and why is our existing approach so flawed? Although I welcome initiatives such as the Trusted Research campaign and the Research Collaboration Advice Team, these bodies focus specifically on research and academia. We expect foreign companies that hold sensitive genomics and DNA data to follow the GDPR. I am not a hawk about relations with other countries, but we need to provide the new Information Commissioner with much greater expertise and powers to tackle complex data security threats in sensitive industries. There must be no trade-off between scientific collaboration and data privacy; that is what this amendment is designed to prevent. I beg to move.
The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.
We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.
I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.
Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.
I wish to say some final words overall. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than agreed, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.
My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.